WIKINDX Resources

Yang, C., Wan, W., Zhao, T., & Lin, Z. minBERT and Extensions over Downstream Tasks.
Resource type: Journal Article
BibTeX citation key: anon.43
Categories: General
Creators: Lin, Wan, Yang, Zhao
URLs: https://www.semant ... fd69096e47f341b45a
Abstract
Pretrained models, such as Bidirectional Encoder Representations from Transformers (BERT), have demonstrated outstanding performance in various natural language understanding tasks. This project aims to implement the minBERT model and fine-tune it for multiple downstream natural language processing (NLP) tasks. In addition, advanced processing techniques are explored as extensions to improve the results across all downstream tasks simultaneously. Specifically, the extensions include sentence concatenation, gradient surgery, and SMART regularization. The best model achieved strong performance on the paraphrase detection task, with an accuracy of 87.8% on the test set, and on the Semantic Textual Similarity (STS) task, with a Pearson correlation coefficient of 0.886 on the test set.
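
The abstract names gradient surgery among the extensions. The report itself is not part of this record, but in multi-task fine-tuning the term usually refers to a PCGrad-style projection: when two task gradients conflict (negative dot product), the conflicting component is removed before the shared parameters are updated. The following is a minimal PyTorch sketch under that assumption, not the authors' actual implementation; the function name pcgrad_merge is hypothetical, and task order is fixed here for clarity where PCGrad proper samples tasks in random order.

    import torch

    def pcgrad_merge(task_grads):
        """Merge per-task gradients with PCGrad-style gradient surgery.

        task_grads: list of 1-D tensors, one flattened gradient per task.
        For each pair (g_i, g_j) with a negative dot product (a conflict),
        the component of g_i along g_j is removed before the gradients
        are summed into a single update direction.
        """
        adjusted = [g.clone() for g in task_grads]
        for i, g_i in enumerate(adjusted):
            for j, g_j in enumerate(task_grads):
                if i == j:
                    continue
                dot = torch.dot(g_i, g_j)
                if dot < 0:  # gradients point in conflicting directions
                    # project g_i onto the normal plane of g_j
                    g_i.sub_(dot / (g_j.norm() ** 2 + 1e-12) * g_j)
        return torch.stack(adjusted).sum(dim=0)

    # Toy usage: two conflicting task gradients
    g1 = torch.tensor([1.0, 1.0])
    g2 = torch.tensor([-1.0, 0.5])
    merged = pcgrad_merge([g1, g2])
    print(merged)  # conflict-free combination of the two task gradients

In a full training loop, the flattened gradients would come from separate backward passes of each downstream task's loss over the shared minBERT parameters.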
  
Notes
[Online; accessed 1. Jun. 2024]
  