Alsuhaibani, M. Deep Learning-based Sentence Embeddings using BERT for Textual Entailment.
Abstract
This study directly and thoroughly investigates the practicalities of utilizing sentence embeddings, derived from the foundations of deep learning, for textual entailment recognition, with a specific emphasis on the robust BERT model. As a cornerstone of our research, we incorporated the Stanford Natural Language Inference (SNLI) dataset. Our study emphasizes a meticulous analysis of BERT’s variable layers to ascertain the optimal layer for generating sentence embeddings that can effectively identify entailment. Our approach deviates from traditional methodologies, as we base our evaluation of entailment on the direct and simple comparison of sentence norms, thereby highlighting the geometrical attributes of the embeddings. Experimental results revealed that the L2 norm of sentence embeddings, drawn specifically from BERT’s 7th layer, proved superior in entailment detection compared to other setups.
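The norm-based comparison described above can be sketched as follows. This is a minimal illustration, not the paper’s implementation: the pooling strategy (mean-pooling here) and the simulated layer-7 hidden states are assumptions, since in a real pipeline the token vectors would come from BERT’s 7th hidden layer.

```python
import numpy as np


def sentence_embedding(token_vectors: np.ndarray) -> np.ndarray:
    """Pool per-token vectors (shape [num_tokens, hidden_dim]) into a
    single sentence vector. Mean-pooling is an assumed choice here."""
    return token_vectors.mean(axis=0)


def l2_norm(vec: np.ndarray) -> float:
    """L2 (Euclidean) norm of a sentence embedding."""
    return float(np.linalg.norm(vec))


# Simulated stand-ins for BERT layer-7 hidden states of a
# premise/hypothesis pair (random; illustrative only).
rng = np.random.default_rng(0)
premise_tokens = rng.normal(size=(12, 768))     # 12 tokens, 768-dim
hypothesis_tokens = rng.normal(size=(7, 768))   # 7 tokens, 768-dim

premise_vec = sentence_embedding(premise_tokens)
hypothesis_vec = sentence_embedding(hypothesis_tokens)

# The entailment signal in this setup is a simple comparison of norms.
norm_gap = abs(l2_norm(premise_vec) - l2_norm(hypothesis_vec))
```

A downstream classifier (or a simple threshold on `norm_gap`) would then decide entailment; the paper’s exact decision rule is not reproduced here.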