WIKINDX

WIKINDX Resources

Lei, Y., Wu, D., Zhou, T., Shen, T., Cao, Y., Tao, C., et al. Meta-Task Prompting Elicits Embeddings from Large Language Models.
Resource type: Journal Article
BibTeX citation key: anon.96
Categories: General
Creators: Cao, Lei, Shen, Tao, Wu, Yates, Zhou
Attachments   URLs   https://www.semant ... tm_medium=30248492
Abstract
In this work, we introduce a new unsupervised embedding method, Meta-Task Prompting with Explicit One-Word Limitation (MetaEOL), for generating high-quality sentence embeddings from Large Language Models (LLMs) without the need for model fine-tuning or task-specific engineering. Leveraging meta-task prompting, MetaEOL guides LLMs to produce embeddings through a series of carefully designed prompts that address multiple representational aspects. Our comprehensive experiments demonstrate that embeddings averaged from various meta-tasks yield competitive performance on Semantic Textual Similarity (STS) benchmarks and excel in downstream tasks, surpassing contrastive-trained models. Our findings suggest a new scaling law for embedding generation, offering a versatile, resource-efficient approach for embedding extraction across diverse sentence-centric scenarios.
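The recipe the abstract describes — several meta-task prompts per sentence, each ending in an explicit one-word completion slot, with the resulting per-prompt vectors averaged — can be sketched as follows. The template wording, the four meta-task framings, and the `encode` callback (standing in for a frozen LLM's hidden state at the final answer position) are illustrative assumptions for this sketch, not the paper's exact prompts or model.

```python
# Illustrative sketch of the MetaEOL idea from the abstract: multiple
# meta-task prompts with a one-word completion constraint, averaged into
# a single sentence embedding. Templates below are assumptions, not the
# paper's exact prompts.

META_TASK_TEMPLATES = [
    # Each template frames the same sentence under a different task view.
    'Classify the topic of the following text.\nThe sentence "{s}" means in one word: "',
    'Judge the sentiment of the following text.\nThe sentence "{s}" means in one word: "',
    'Decide what the following text paraphrases.\nThe sentence "{s}" means in one word: "',
    'Extract the key information from the following text.\nThe sentence "{s}" means in one word: "',
]


def build_prompts(sentence: str) -> list[str]:
    """Instantiate every meta-task template for one input sentence."""
    return [t.format(s=sentence) for t in META_TASK_TEMPLATES]


def meta_eol_embedding(sentence: str, encode) -> list[float]:
    """Average the vectors produced for each meta-task prompt.

    `encode(prompt)` stands in for a forward pass through a frozen LLM
    that returns the hidden state at the final (one-word answer)
    position; no fine-tuning is involved.
    """
    vectors = [encode(p) for p in build_prompts(sentence)]
    dim = len(vectors[0])
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(dim)]
```

In the actual method the `encode` step would be an LLM forward pass; the averaging across meta-task views is what the abstract reports as yielding competitive STS performance without any task-specific training.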
WIKINDX 6.11.0 | Total resources: 209 | Username: -- | Bibliography: WIKINDX Master Bibliography | Style: American Psychological Association (APA)