WIKINDX Resources

Behrman, Z., & Christoglou, G. Evaluating Contrastive Learning Strategies for Enhanced Performance in Downstream Tasks. 
Resource type: Journal Article
BibTeX citation key: anonn
Categories: General
Creators: Behrman, Christoglou
URLs: https://www.semant ... tm_medium=33014503
Abstract
Sentence embeddings are an important but challenging domain of NLP. Recent developments in self-attention, transformers, and BERT have enabled a new learned-embedding paradigm. While these methods have made significant strides, this paper surveys the effect of recent contrastive-learning advances on embedding quality. We first implement unsupervised SimCSE, supervised SimCSE, and DiffCSE, alongside extensions using data augmentation. With these pretraining strategies defined, we then develop classification heads for textual similarity, sentiment analysis, and paraphrase detection. Finally, we assess each pretraining strategy on the downstream classification heads, with the goal of finding the most performant architecture. Through our experiments, we found that the authors' version of supervised SimCSE combined with our classification heads performed best, placing us in the top ∼13 of leaderboard submissions.
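
For reference, unsupervised SimCSE (the pretraining strategy the abstract builds on) encodes each sentence twice with independent dropout masks, treats the two encodings as a positive pair, and uses the other sentences in the batch as negatives under an InfoNCE loss. A minimal PyTorch sketch of that objective follows; the function name and the random inputs are illustrative, not the authors' code (the published SimCSE temperature is 0.05):

import torch
import torch.nn.functional as F

def simcse_unsup_loss(z1, z2, temperature=0.05):
    # z1, z2: (batch, dim) embeddings of the SAME sentences from two
    # forward passes with different dropout masks; (z1[i], z2[i]) is a
    # positive pair, every other row in the batch is a negative.
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.T / temperature            # cosine similarities, (batch, batch)
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)      # row i should match column i

# Illustrative call: random tensors stand in for two dropout-perturbed passes.
z1 = torch.randn(8, 768)
z2 = z1 + 0.01 * torch.randn(8, 768)
print(simcse_unsup_loss(z1, z2).item())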
  
Notes
[Online; accessed 25 May 2024]
  