WIKINDX Resources

Yao, T. Improving Semantic Meaning of BERT Sentence Embeddings. 
Resource type: Journal Article
BibTeX citation key: anon.187
Categories: General
Creators: Yao
URLs: https://www.semant ... 9e12cea466be4b5042
Abstract
SBERT has been shown to improve the performance of BERT on downstream tasks, such as semantic textual similarity (STS), by deriving semantically meaningful sentence embeddings from the BERT output. We examine whether SBERT fine-tuning is also effective in a multitask setting. Specifically, we investigate whether fine-tuning the BERT model using the SBERT method can simultaneously improve performance on three downstream tasks: sentiment analysis, paraphrase detection, and STS. We experiment with different combinations of pooling strategies and fine-tuning methods. The results indicate that the SBERT model provides a solid foundation for improving the semantic meaning of the output sentence embeddings in the multitask domain, generalizing to other tasks and datasets better than the standard BERT model.
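As a rough illustration of the pooling step the abstract refers to, the following is a minimal Python sketch, not the paper's code, of SBERT-style mean pooling over BERT token embeddings, with STS scored as cosine similarity. The model name bert-base-uncased and the example sentences are illustrative assumptions, and the siamese fine-tuning that SBERT adds on top of this pooling is omitted.

    # Minimal sketch of SBERT-style mean pooling: a fixed-size sentence
    # embedding is the average of BERT's token embeddings, weighted by the
    # attention mask so padding tokens are ignored.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed base model
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embed(sentences):
        # Tokenize a batch with padding so all sentences share one tensor.
        inputs = tokenizer(sentences, padding=True, truncation=True,
                           return_tensors="pt")
        with torch.no_grad():
            token_embeddings = model(**inputs).last_hidden_state  # (batch, seq, hidden)
        # Mean pooling: zero out padding positions, then average real tokens.
        mask = inputs["attention_mask"].unsqueeze(-1).float()
        return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

    # STS as cosine similarity between the two pooled sentence embeddings.
    a, b = embed(["A man is playing guitar.", "Someone plays a guitar."])
    print(torch.nn.functional.cosine_similarity(a, b, dim=0).item())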
  
Notes
[Online; accessed 29 Aug. 2024]
  