WIKINDX Resources

Li, X., & Li, J. BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings. 
Resource type: Journal Article
BibTeX citation key: anon.101
Categories: General
Creators: Li, Li
URLs: https://www.semant ... be30cfc6b013434295
Abstract
Sentence embeddings are crucial for measuring semantic similarity. Most recent studies employ large language models (LLMs) to learn sentence embeddings, but existing LLMs mainly adopt an autoregressive architecture without explicit backward dependency modeling. We therefore examine the effects of backward dependencies in LLMs for semantic similarity measurement. Concretely, we propose a novel model, the backward dependency enhanced large language model (BeLLM), which learns sentence embeddings by transforming specific attention layers from uni- to bi-directional. We experiment extensively across various semantic textual similarity (STS) tasks and downstream applications. BeLLM achieves state-of-the-art performance in varying scenarios, showing that auto-regressive LLMs benefit from backward dependencies for sentence embeddings.
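
The abstract describes converting specific attention layers of an auto-regressive LLM from uni- to bi-directional so that the resulting sentence embeddings also capture backward dependencies. The following is a minimal, self-contained PyTorch sketch of that general idea, not the authors' implementation: the single-head identity-projection attention, the choice of which layer to make bi-directional, and the mean pooling are all illustrative assumptions.

# Minimal sketch (assumed, not from the paper): in a decoder-only transformer,
# attention uses a causal mask so each token only attends to earlier tokens;
# converting a layer to bi-directional attention amounts to dropping that mask.
import torch
import torch.nn.functional as F


def self_attention(x: torch.Tensor, causal: bool) -> torch.Tensor:
    """Single-head self-attention over x of shape (seq_len, dim)."""
    seq_len, dim = x.shape
    q, k, v = x, x, x                      # identity projections keep the sketch short
    scores = q @ k.T / dim ** 0.5          # (seq_len, seq_len) attention scores
    if causal:
        # Uni-directional: token i may only attend to tokens j <= i.
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v


def sentence_embedding(token_states: torch.Tensor, bidirectional: bool) -> torch.Tensor:
    """Mean-pool token states after one attention layer that is causal or bi-directional."""
    out = self_attention(token_states, causal=not bidirectional)
    return out.mean(dim=0)                 # mean pooling is an assumption for illustration


if __name__ == "__main__":
    hidden = torch.randn(6, 16)            # toy hidden states for a 6-token sentence
    uni = sentence_embedding(hidden, bidirectional=False)
    bi = sentence_embedding(hidden, bidirectional=True)
    print(uni.shape, bi.shape)             # both are 16-dim sentence embeddings

In the bi-directional case every token's representation can depend on later tokens as well, which is the "backward dependency" the abstract refers to.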
  
Notes
[Online; accessed 31 May 2024]
  