WIKINDX Resources  

Khemka, P., & Casarez, G. BERT With Multitask Fine-Tuning and Loss Construction. Stanford CS224N Default Project.
Resource type: Journal Article
BibTeX citation key: anon.37
Categories: General
Creators: Casarez, Khemka
Attachments   URLs   https://www.semant ... 68644556ac2a0b20ef
Abstract
Bidirectional Encoder Representations from Transformers, or BERT, is a transformer-based model that generates contextual word representations that can then be adapted for use on various natural language processing (NLP) tasks. In this study, our goal is to implement a minimalist BERT model to perform several sentence-level tasks simultaneously: sentiment analysis, paraphrase detection, and semantic textual similarity (STS). To approach the problem, we conduct several experiments using BERT embeddings in combination with techniques including similarity task fine-tuning, multitask loss construction, and round-robin sampling procedures. Through our experimentation, we found that using a weighted multitask loss function, in conjunction with combined similarity task fine-tuning, improved BERT's performance across all three tasks. We achieve test performance of 0.526 accuracy for sentiment classification, 0.861 accuracy for paraphrase detection, and a 0.775 Pearson correlation coefficient for STS.
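
The central technique named in the abstract, a weighted multitask loss combining the three task objectives, can be sketched as follows. This is a minimal illustration under stated assumptions: the particular loss functions, tensor shapes, and task weights below are placeholders and are not specified in the record.

# Minimal sketch of a weighted multitask loss for the three tasks described
# in the abstract. Loss choices and weights are illustrative assumptions,
# not taken from the report.
import torch.nn.functional as F

def weighted_multitask_loss(
    sentiment_logits, sentiment_labels,    # (B, num_classes), (B,) class indices
    paraphrase_logits, paraphrase_labels,  # (B,), (B,) in {0, 1}
    sts_preds, sts_scores,                 # (B,), (B,) similarity scores
    weights=(1.0, 1.0, 1.0),               # assumed per-task weights
):
    # Cross-entropy for multi-class sentiment classification.
    l_sent = F.cross_entropy(sentiment_logits, sentiment_labels)
    # Binary cross-entropy for paraphrase detection.
    l_para = F.binary_cross_entropy_with_logits(
        paraphrase_logits, paraphrase_labels.float()
    )
    # Mean-squared error for the STS regression target.
    l_sts = F.mse_loss(sts_preds, sts_scores)
    w_sent, w_para, w_sts = weights
    return w_sent * l_sent + w_para * l_para + w_sts * l_sts

In a round-robin training loop, one batch per task would be drawn in turn and the corresponding term (or the full weighted sum) backpropagated; the report's exact sampling and weighting scheme is not detailed in this record.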
  
Notes
[Online; accessed 1 Jun. 2024]
  