Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval

28 Sep 2020 (modified: 25 Jan 2021) · ICLR 2021 Poster
  • Keywords: Dense Retrieval, Text Retrieval, Text Representation, Neural IR
  • Abstract: Conducting text retrieval in a dense representation space has many intriguing advantages, yet end-to-end learned dense retrieval (DR) often underperforms word-based sparse retrieval. In this paper, we first show theoretically that the learning bottleneck of dense retrieval is the domination of uninformative negatives sampled locally in batch, which yield diminishing gradient norms, large stochastic gradient variances, and slow convergence. We then propose Approximate nearest neighbor Negative Contrastive Learning (ANCE), a learning mechanism that selects hard training negatives globally from the entire corpus using an asynchronously updated ANN index (sketched after this list). Our experiments demonstrate the effectiveness of ANCE on web search, question answering, and in a commercial search environment, showing that ANCE dot-product retrieval nearly matches the accuracy of a BERT-based cascade IR pipeline while being 100x more efficient. We also empirically validate our theory that negative sampling with ANCE better approximates the oracle gradient-norm-based importance sampling, thus improving the convergence of stochastic training.
  • One-sentence Summary: This paper improves the learning of dense text retrieval using ANCE, which selects global negatives with larger gradient norms using an asynchronously updated ANN index.
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • Supplementary Material: zip
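
The asynchronous negative-mining loop described in the abstract can be illustrated with a minimal sketch. The code below assumes a PyTorch dual encoder and a FAISS inner-product index; the names `encode_corpus`, `encode_queries`, `encode_docs`, and `refresh_every` are hypothetical placeholders, not the authors' released code.

```python
# A minimal sketch of ANCE-style training (assumed setup: PyTorch dual
# encoder, FAISS flat inner-product index). Names such as `encode_corpus`
# and `refresh_every` are hypothetical, not the paper's code.
import faiss
import numpy as np
import torch
import torch.nn.functional as F

def build_ann_index(doc_emb: np.ndarray) -> faiss.IndexFlatIP:
    """Index corpus embeddings for dot-product (inner-product) search."""
    index = faiss.IndexFlatIP(doc_emb.shape[1])
    index.add(doc_emb.astype(np.float32))
    return index

def mine_global_negatives(index, query_emb: np.ndarray, pos_ids, k=200):
    """Retrieve top-k documents per query from the entire corpus; any
    retrieved document that is not the labeled positive is a hard negative."""
    _, retrieved = index.search(query_emb.astype(np.float32), k)
    return [next(i for i in ids if i != pos)
            for ids, pos in zip(retrieved, pos_ids)]

def ance_loss(q, d_pos, d_neg):
    """Contrastive (NLL) loss over dot-product scores: the positive
    document sits at index 0 of the logits."""
    pos = (q * d_pos).sum(-1, keepdim=True)            # [B, 1]
    neg = (q * d_neg).sum(-1, keepdim=True)            # [B, 1]
    logits = torch.cat([pos, neg], dim=-1)             # [B, 2]
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
    return F.cross_entropy(logits, labels)

def train(encoder, loader, corpus_texts, optimizer, refresh_every=1000):
    """Negatives come from a periodically rebuilt ANN index, so they lag
    the encoder slightly but stay globally hard rather than locally
    sampled in batch."""
    index = None
    for step, (queries, positives, pos_ids) in enumerate(loader):
        if step % refresh_every == 0:
            with torch.no_grad():                      # expensive, infrequent
                doc_emb = encoder.encode_corpus(corpus_texts)
            index = build_ann_index(doc_emb)
        q = encoder.encode_queries(queries)
        neg_ids = mine_global_negatives(index, q.detach().cpu().numpy(), pos_ids)
        d_pos = encoder.encode_docs(positives)
        d_neg = encoder.encode_docs([corpus_texts[i] for i in neg_ids])
        loss = ance_loss(q, d_pos, d_neg)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In the paper, the index update runs asynchronously alongside training rather than inline; the in-loop refresh above is only to keep the sketch self-contained.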