WIKINDX Resources

Xie, X., Mentor, & Khosla, S. BERT Finetuning Analysis.
Resource type: Journal Article
BibTeX citation key: anon.184
Categories: General
Creators: Khosla, Mentor, Xie
URLs: https://www.semant ... 6fe938ff37d05a4455
Abstract
BERT is pretrained as a generalist at language representation, which makes it a natural starting point for investigations into raising its accuracy on more specific tasks such as sentiment analysis, paraphrase detection, and semantic textual similarity. The goal is to experiment with different per-task architectures, loss functions, and fine-tuning techniques in order to build a model that performs well across all three subtasks. Since the contributions lie mostly in fine-tuning, the findings make clear that without further pretraining, in which BERT gets direct access to task-specific data to update its parameters, fine-tuning alone requires considerably more work to reach comparable performance. The experiments achieved nearly threefold improvements over baseline scores, and these promising results suggest that further gains can be attained with better fine-tuning or other techniques outside this paper's scope.
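The abstract describes fine-tuning a shared BERT encoder with a separate architecture for each subtask. The following is a minimal illustrative sketch of that kind of setup, assuming the Hugging Face transformers library; the class name, head sizes, and [CLS] pooling choice are assumptions for illustration, not details taken from the paper.

# Illustrative sketch only: shared BERT encoder with task-specific heads,
# assuming Hugging Face `transformers`. Not the paper's actual architecture.
import torch
import torch.nn as nn
from transformers import AutoModel

class MultiTaskBert(nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # One lightweight head per subtask; in fine-tuning, only these heads
        # and the shared encoder are updated (no additional pretraining).
        self.sentiment_head = nn.Linear(hidden, 5)       # e.g. 5-class sentiment
        self.paraphrase_head = nn.Linear(hidden * 2, 1)  # binary paraphrase logit
        self.similarity_head = nn.Linear(hidden * 2, 1)  # STS regression score

    def encode(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]  # [CLS] token representation

    def forward_sentiment(self, input_ids, attention_mask):
        return self.sentiment_head(self.encode(input_ids, attention_mask))

    def forward_pair(self, ids_1, mask_1, ids_2, mask_2, head):
        # Encode both sentences, concatenate, and score with the given head.
        pooled = torch.cat([self.encode(ids_1, mask_1),
                            self.encode(ids_2, mask_2)], dim=-1)
        return head(pooled)

In use, paraphrase detection would call forward_pair with self.paraphrase_head and semantic textual similarity with self.similarity_head, so the three subtasks share one encoder while keeping distinct output layers and loss functions.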
  
Notes
[Online; accessed 1. Jun. 2024]
  