WIKINDX Resources

Campos, D. F., Zhai, C., & Magnani, A. CAPOT: Creating Robust Dense Query Encoders using Post Training Contrastive Alignment. 
Resource type: Journal Article
BibTeX citation key: anonr
Categories: General
Creators: Campos, Magnani, Zhai
URLs: https://www.semant ... 6d5b899e0a6e30877d
Abstract
The success of contextual word representations and advances in neural information retrieval have made dense vector-based retrieval a standard approach for passage and document ranking. While effective and efficient, dual-encoders are brittle to variations in query distributions and noisy queries. Data augmentation can make models more robust but introduces overhead to training set generation and requires retraining and index regeneration. We present Contrastive Alignment POst Training (CAPOT), a highly efficient finetuning method that improves model robustness without requiring index regeneration, training set optimization, or alteration. CAPOT enables robust retrieval by freezing the document encoder while the query encoder learns to align noisy queries with their unaltered root. We evaluate CAPOT on noisy variants of MS-MARCO, Natural Questions, and Trivia QA passage retrieval, finding CAPOT has a similar impact as data augmentation with none of its overhead.
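
The abstract's core idea (keep the document tower frozen and post-train only the query encoder so that noisy queries are pulled toward their unaltered root) can be illustrated with a minimal PyTorch sketch. This is a hedged, assumption-laden illustration rather than the authors' released implementation: the toy Encoder class, the in-batch contrastive loss, and the detached clean-query anchor are stand-ins chosen for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    # Toy stand-in for a transformer query/document tower (hypothetical).
    def __init__(self, vocab_size=30522, dim=128):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab_size, dim)

    def forward(self, token_ids):
        return F.normalize(self.emb(token_ids), dim=-1)

query_encoder = Encoder()
doc_encoder = Encoder()
for p in doc_encoder.parameters():
    p.requires_grad = False  # document tower frozen: the existing passage index stays valid

optimizer = torch.optim.AdamW(query_encoder.parameters(), lr=1e-5)

def alignment_loss(noisy_ids, clean_ids, temperature=0.05):
    # In-batch contrastive loss: each noisy query is pulled toward its own
    # unaltered ("root") query and pushed away from the other roots in the batch.
    with torch.no_grad():
        roots = query_encoder(clean_ids)      # anchors; detaching them is an assumption
    noisy = query_encoder(noisy_ids)
    logits = noisy @ roots.t() / temperature  # cosine similarities (embeddings are normalized)
    targets = torch.arange(noisy.size(0))     # positive pair sits on the diagonal
    return F.cross_entropy(logits, targets)

# Toy batch: rows are token-id sequences for clean queries and their noisy variants.
clean_ids = torch.randint(0, 30522, (8, 16))
noisy_ids = torch.randint(0, 30522, (8, 16))

optimizer.zero_grad()
loss = alignment_loss(noisy_ids, clean_ids)
loss.backward()
optimizer.step()

Because only the query encoder receives gradients, the passage index built with the frozen document encoder never needs to be regenerated, which is the overhead relative to data augmentation that the abstract says CAPOT avoids.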
  
Notes
[Online; accessed 1. Jun. 2024]
  