Yao, T. Improving Semantic Meaning of BERT Sentence Embeddings.
Abstract
SBERT has been shown to improve the performance of BERT on downstream tasks, such as semantic textual similarity (STS), by deriving semantically meaningful sentence embeddings from the BERT output. We examine whether SBERT fine-tuning is also effective in a multitask setting. Specifically, we investigate whether fine-tuning the BERT model with the SBERT method can improve performance simultaneously on three downstream tasks: sentiment analysis, paraphrase detection, and STS. We experiment with different combinations of pooling strategies and fine-tuning methods. The results indicate that the SBERT model provides a solid foundation for improving the semantic meaning of the output sentence embeddings in the multitask domain, and that it transfers to other tasks and datasets more readily than the standard BERT model.
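For context, SBERT-style models obtain a fixed-size sentence embedding by pooling BERT's token-level outputs (for example, mean pooling) and are then fine-tuned so that semantically similar sentences lie close together in embedding space. The sketch below illustrates mean pooling with the Hugging Face transformers library; the model name and pooling choice are illustrative assumptions, not details taken from this paper.

```python
# Minimal sketch of mean pooling over BERT token embeddings.
# "bert-base-uncased" and mean pooling are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_embedding(sentences):
    # Tokenize a batch of sentences, padding so they share one tensor shape.
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        token_embeddings = model(**enc).last_hidden_state   # (batch, seq_len, hidden)
    # Mask out padding tokens, then average the remaining token vectors.
    mask = enc["attention_mask"].unsqueeze(-1).float()       # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts                                    # (batch, hidden)

emb = sentence_embedding(["A man is playing guitar.", "Someone plays an instrument."])
print(torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0).item())
```

In SBERT-style fine-tuning, embeddings produced this way are trained with pairwise objectives (e.g., on sentence pairs for STS or paraphrase detection) so that cosine similarity reflects semantic similarity.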
Notes
[Online; accessed 29. Aug. 2024]