WIKINDX

WIKINDX Resources

Li, R., Cheng, L., Wang, D., & Tan, J. Siamese BERT architecture model with attention mechanism for textual semantic similarity.
Resource type: Journal Article
BibTeX citation key: anon.107
Categories: General
Creators: Cheng, Li, Tan, Wang
URLs: https://www.semant ... 8d768707ba1682b8ea
Abstract
A Siamese BERT network model is proposed to measure textual semantic similarity; applied to three related semantic similarity datasets, the network performs better than other approaches. Textual semantic similarity is a crucial part of text-matching tasks and has a wide range of applications in natural language processing (NLP), such as search engines, question-answering systems, information retrieval, and natural language inference. Although a variety of approaches to textual semantic similarity exist, many fail to learn a representation that captures the semantics of a sentence or text well, and they ignore that different words contribute to the meaning of the whole sentence to different degrees. This paper therefore proposes a Siamese BERT network model for textual semantic similarity. First, the BERT model is used to obtain semantic features for each word of an input sentence, exploiting the merit of the Siamese design: both branches share the same encoder and feature weights, which reduces the number of training parameters. An attention mechanism is then applied to obtain higher-level semantic features. Finally, the similarity between two sentences is derived either by computing the distance between their high-level semantic representations or by concatenating them. The network structure is applied to three related semantic similarity datasets, where it performs better than other approaches.
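The Siamese arrangement the abstract describes (a shared encoder producing token features, attention pooling into a sentence vector, then a distance-based score) can be sketched in a few lines. This is a hypothetical, simplified illustration, not the authors' implementation: random vectors stand in for BERT token embeddings, the attention is a single shared query vector, and cosine similarity plays the role of the distance-based scoring option.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(token_embs, query):
    """Pool token embeddings into one sentence vector.

    Attention weights come from dot products with a query vector that is
    shared by both branches -- the Siamese property: same weights, so the
    two sentences are encoded identically.
    """
    weights = softmax(token_embs @ query)
    return weights @ token_embs

def cosine(a, b):
    """Cosine similarity between two sentence vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
d = 8                                  # toy embedding dimension
query = rng.normal(size=d)             # shared attention parameters

# Stand-ins for BERT token embeddings (rows = tokens).
sent_a = rng.normal(size=(5, d))
sent_b = sent_a + 0.1 * rng.normal(size=(5, d))  # near-paraphrase of A
sent_c = rng.normal(size=(4, d))                 # unrelated sentence

sim_ab = cosine(attention_pool(sent_a, query), attention_pool(sent_b, query))
sim_ac = cosine(attention_pool(sent_a, query), attention_pool(sent_c, query))
```

Because `sent_b` is a small perturbation of `sent_a`, `sim_ab` comes out much higher than `sim_ac`; the paper's alternative scoring route, concatenating the two pooled vectors and feeding them to a classifier, would replace the `cosine` step.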
  
WIKINDX 6.11.0 | Total resources: 209 | Username: -- | Bibliography: WIKINDX Master Bibliography | Style: American Psychological Association (APA)