WIKINDX

WIKINDX Resources

Swathi, K., & Reddy, K. V. Solving sentiment analysis using transformer-based BERT.
Resource type: Journal Article
BibTeX citation key: anon.160
Categories: General
Creators: Reddy, Swathi
URLs: https://www.semant ... tm_medium=30248492
Abstract
BERT (Bidirectional Encoder Representations from Transformers) is a language representation model. The most prevalent sequence transduction models are neural networks built from an encoder and a decoder; the stronger variants also add an attention mechanism connecting the two. Because it conditions on both the left and the right context at every layer, BERT can pre-train deep bidirectional representations from unlabeled text. The pre-trained BERT model can then be fine-tuned with just one additional output layer to produce state-of-the-art models for a wide variety of tasks, such as question answering and natural language inference. It is easy to use and backed by a wealth of data from the scientific community.
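The "one extra output layer" the abstract refers to can be illustrated with a minimal numpy sketch: a single linear-plus-softmax head mapping a pooled [CLS] embedding to sentiment classes. Everything here (the hidden size, class count, and random weights) is illustrative and not taken from the paper; a real fine-tuning setup would train this head jointly with the pre-trained encoder.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions: BERT-base hidden size (768) and two sentiment classes.
HIDDEN, CLASSES = 768, 2
rng = np.random.default_rng(0)
W = rng.normal(scale=0.02, size=(HIDDEN, CLASSES))  # the "one extra output layer"
b = np.zeros(CLASSES)

def classify(cls_embedding):
    """Map a pooled [CLS] vector to sentiment-class probabilities."""
    return softmax(cls_embedding @ W + b)

# Stand-in for an encoder output; in practice this comes from pre-trained BERT.
probs = classify(rng.normal(size=HIDDEN))
```

During fine-tuning, the cross-entropy loss on these probabilities would be backpropagated through both the head and the encoder weights.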
  
Notes
[Online; accessed 25. May 2024]
  
WIKINDX 6.11.0 | Total resources: 209 | Username: -- | Bibliography: WIKINDX Master Bibliography | Style: American Psychological Association (APA)