
WIKINDX Resources  

Chin, J. R., Socana, M. O., Chiruvolu, J., & Yang, C. Improving MiniBERT’s Semantic Performance with Semantic-rich Sentence Embeddings. Stanford CS224N Default Project.
Resource type: Journal Article
BibTeX citation key: anon.42
Categories: General
Creators: Chin, Chiruvolu, Socana, Yang
Attachments   URLs   https://www.semant ... 8da98a203589596093
Abstract
We have developed a Sentence-BERT (SBERT) transformer model that achieves performance comparable to BERT on sentiment classification, paraphrase detection, and semantic textual similarity (STS) tasks. SBERT is a method for generating high-quality sentence embeddings, providing a way for machines to represent the meaning of sentences and to compare them with one another. In principle, SBERT should achieve higher performance on tasks that require the model to consider multiple sentence embeddings, such as paraphrase detection and STS. In addition to SBERT, we have implemented various significant changes to both the model and the training algorithms to increase performance and to experiment with more advanced techniques. Experiments with our implementation of SBERT and the subsequent additions have shown slightly better performance on tasks that can benefit from sentence embeddings. Other additions to the standard baseline BERT model include experiments with L2 regularization, adversarial regularization, additional datasets, and task-specific loss functions.
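
The central idea summarized above, deriving a fixed-size embedding for each sentence so that two sentences can be compared directly, can be illustrated with a short sketch. The snippet below is not the authors' implementation; it assumes a Hugging Face bert-base-uncased checkpoint and uses simple mean pooling with cosine similarity, which is one common SBERT-style setup for STS-type comparisons.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Hypothetical SBERT-style similarity sketch (not the paper's code):
# encode each sentence independently, mean-pool its token embeddings into
# a fixed-size vector, then compare vectors with cosine similarity.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str) -> torch.Tensor:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, hidden_dim)
    mask = inputs["attention_mask"].unsqueeze(-1)    # mask out padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)      # mean-pooled sentence vector

a = embed("A man is playing a guitar.")
b = embed("Someone is strumming a guitar.")
print(F.cosine_similarity(a, b).item())              # similarity score, as in STS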
  
Notes
[Online; accessed 1. Jun. 2024]
  