WIKINDX Resources

Ayyagari, R., & Hodges, A. We do it BERTer: Comparison of Finetuning Methods to Improve Sentence Embeddings. Stanford CS224N Default Project.
Resource type: Journal Article
BibTeX citation key: anon.49
Categories: General
Creators: Ayyagari, Hodges
URLs: https://www.semant ... 0a311c3d20a4852cfc
Abstract
We aim to contribute to the growing body of work that finetunes Bidirectional Encoder Representations from Transformers (BERT) [3] for natural language understanding (NLU) tasks by creating a more efficient task-specific model for (1) sentiment analysis, (2) paraphrase detection, and (3) semantic textual similarity. In this paper, we implement a model that achieves an overall accuracy of 0.622 on the test leaderboard, an improvement on the traditional BERT model's overall accuracy. In doing so, we demonstrate that an ensemble method combining multi-task finetuning, cosine similarity, and the AdamW optimizer can improve on the traditional BERT baseline.
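The abstract names the moving parts (multi-task finetuning of BERT across the three tasks, a cosine-similarity score for semantic textual similarity, and the AdamW optimizer) without showing code. As a rough illustration only, the following minimal PyTorch sketch wires those parts together; the class name, head shapes, learning rate, and the rescaling of the cosine score to a [0, 5] range are assumptions made for this sketch, not the authors' implementation.

    # Minimal sketch of multi-task BERT finetuning with a cosine-similarity
    # STS head and AdamW, per the abstract. Details are illustrative assumptions.
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class MultitaskBERT(nn.Module):
        def __init__(self, hidden: int = 768):
            super().__init__()
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            self.sentiment_head = nn.Linear(hidden, 5)       # (1) 5-class sentiment (assumed)
            self.paraphrase_head = nn.Linear(2 * hidden, 1)  # (2) paraphrase logit (assumed)

        def embed(self, input_ids, attention_mask):
            # Use the pooled [CLS] output as the sentence embedding.
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            return out.pooler_output

        def predict_sentiment(self, ids, mask):
            return self.sentiment_head(self.embed(ids, mask))

        def predict_paraphrase(self, ids1, mask1, ids2, mask2):
            e1, e2 = self.embed(ids1, mask1), self.embed(ids2, mask2)
            return self.paraphrase_head(torch.cat([e1, e2], dim=-1)).squeeze(-1)

        def predict_similarity(self, ids1, mask1, ids2, mask2):
            # (3) STS: cosine similarity of the two sentence embeddings,
            # rescaled from [-1, 1] to a [0, 5] score range (assumed).
            e1, e2 = self.embed(ids1, mask1), self.embed(ids2, mask2)
            return (torch.cosine_similarity(e1, e2) + 1.0) * 2.5

    model = MultitaskBERT()
    # AdamW, as named in the abstract; the learning rate here is a common
    # BERT-finetuning default, not taken from the paper.
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

In a multi-task finetuning loop, each head's task loss (e.g. cross-entropy for sentiment and paraphrase, MSE against gold scores for STS) would be backpropagated through the shared BERT encoder, which is what lets the three tasks share one set of sentence embeddings.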
  
Notes
[Online; accessed 1. Jun. 2024]
  