WIKINDX Resources

Alsuhaibani, M. Deep Learning-based Sentence Embeddings using BERT for Textual Entailment. 
Resource type: Journal Article
BibTeX citation key: anonf
Categories: General
Creators: Alsuhaibani
URLs: https://www.semant ... 0b4cc0c03a41612c48
Abstract
This study directly and thoroughly investigates the practicalities of utilizing sentence embeddings, derived from the foundations of deep learning, for textual entailment recognition, with a specific emphasis on the robust BERT model. As a cornerstone of our research, we incorporated the Stanford Natural Language Inference (SNLI) dataset. Our study emphasizes a meticulous analysis of BERT's variable layers to ascertain the optimal layer for generating sentence embeddings that can effectively identify entailment. Our approach deviates from traditional methodologies, as we base our evaluation of entailment on the direct and simple comparison of sentence norms, subsequently highlighting the geometrical attributes of the embeddings. Experimental results revealed that the L2 norm of sentence embeddings, drawn specifically from BERT's 7th layer, emerged superior in entailment detection compared to other setups.
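
The approach described in the abstract can be illustrated with a short sketch: sentence embeddings are taken from a chosen BERT layer and only their L2 norms are compared for a premise/hypothesis pair. This is not the authors' code; the model checkpoint, mean pooling over tokens, and the example sentences are assumptions made here for illustration.

# Hypothetical sketch of the layer-wise L2-norm comparison described in the abstract.
# Mean pooling and the choice of bert-base-uncased are assumptions, not details from the paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

def sentence_norm(sentence: str, layer: int = 7) -> float:
    """Return the L2 norm of a mean-pooled sentence embedding from one BERT layer."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # hidden_states[0] is the input embedding layer; hidden_states[7] is the 7th encoder layer
    hidden = outputs.hidden_states[layer]            # shape: (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)    # ignore padding tokens when pooling
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    return pooled.norm(p=2).item()

# Example premise/hypothesis pair (illustrative only, not from SNLI)
premise_norm = sentence_norm("A man is playing a guitar on stage.")
hypothesis_norm = sentence_norm("A person is making music.")
print(premise_norm, hypothesis_norm)

A decision rule over the two norms (for example, thresholding their difference) would still be needed to label a pair as entailment; the abstract does not specify that rule here.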
  