
WIKINDX Resources  

Dushime, A., & Mungu, W. BERT Extension Using Sentence-BERT for Sentence Embedding. Stanford CS224n Default Project.
Resource type: Journal Article
BibTeX citation key: anon.36
Categories: General
Creators: Dushime, Mungu
URLs: https://www.semant ... tm_medium=34898166
Abstract
The aim of this project is to implement extensions that improve the performance of the BERT model on three downstream tasks: sentiment analysis, paraphrase detection, and semantic textual similarity. To do this, I first implemented a round-robin approach that combines the losses of the three tasks so that the model trains and improves on all three at the same time. I then targeted the underlying sentence-embedding representations that serve as the backbone for all three tasks by implementing a Siamese network. The idea is that, since these tasks depend on the embeddings produced by BERT, improving these embeddings, together with other modifications, should improve performance on the tasks. The project specifically targeted the semantic textual similarity task, although, as will become clear, I obtained improvements on all three tasks.
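The following is a minimal sketch, not the report's code, of the two ideas the abstract describes: (1) a round-robin step that combines the losses of the three tasks into one update of a shared BERT encoder, and (2) Siamese (Sentence-BERT style) mean-pooled sentence embeddings feeding all three task heads. The class and function names, head shapes, SBERT-style concat objective for paraphrase detection, mean pooling, loss choices, and the 0-5 STS scale are assumptions, not details taken from the report.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel

class MultitaskSBert(nn.Module):
    def __init__(self, n_sentiment_classes=5, hidden=768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.sentiment_head = nn.Linear(hidden, n_sentiment_classes)
        # Paraphrase head follows the SBERT classification objective:
        # concat(u, v, |u - v|) -> a single binary logit.
        self.paraphrase_head = nn.Linear(3 * hidden, 1)

    def embed(self, input_ids, attention_mask):
        # Siamese branch: mean-pool token states into one sentence vector,
        # as in Sentence-BERT, instead of using the raw [CLS] vector.
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        mask = attention_mask.unsqueeze(-1).float()
        return (states * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

    def sentiment_logits(self, ids, mask):
        return self.sentiment_head(self.embed(ids, mask))

    def paraphrase_logit(self, ids1, mask1, ids2, mask2):
        u, v = self.embed(ids1, mask1), self.embed(ids2, mask2)
        return self.paraphrase_head(torch.cat([u, v, (u - v).abs()], dim=-1))

    def similarity(self, ids1, mask1, ids2, mask2):
        # Cosine-similarity regression for STS, scaled to the 0-5 range.
        u, v = self.embed(ids1, mask1), self.embed(ids2, mask2)
        return F.cosine_similarity(u, v) * 5.0

def round_robin_step(model, optimizer, sst_batch, para_batch, sts_batch):
    # One combined step: sum the three task losses, then backprop once.
    # Batch layouts assumed: (ids, mask, labels) for sentiment and
    # (ids1, mask1, ids2, mask2, labels) for the two pair tasks.
    loss = F.cross_entropy(
        model.sentiment_logits(*sst_batch[:2]), sst_batch[2])
    loss = loss + F.binary_cross_entropy_with_logits(
        model.paraphrase_logit(*para_batch[:4]).squeeze(-1),
        para_batch[4].float())
    loss = loss + F.mse_loss(
        model.similarity(*sts_batch[:4]), sts_batch[4].float())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

Summing the losses within a single step is one reading of "combines the losses"; a stricter round robin would alternate optimizer steps across the tasks, and the report may well do either.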
  
Notes
[Online; accessed 3. Jun. 2024]
  