Distilling Knowledge from Reader to Retriever for Question Answering

Published: 12 Jan 2021, Last Modified: 22 Oct 2023 · ICLR 2021 Poster
Keywords: question answering, information retrieval
Abstract: Information retrieval is an important component of many natural language processing systems, such as open domain question answering. While traditional methods were based on hand-crafted features, continuous representations based on neural networks have recently obtained competitive results. A challenge with such methods is obtaining supervised data to train the retriever model, i.e. pairs of queries and supporting documents. In this paper, we propose a technique, inspired by knowledge distillation, to learn retriever models for downstream tasks without annotated query-document pairs. Our approach leverages the attention scores of a reader model, which solves the task based on the retrieved documents, to obtain synthetic labels for the retriever. We evaluate our method on question answering, obtaining state-of-the-art results.
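The abstract describes distilling the reader's attention into the retriever by treating aggregated attention scores as synthetic relevance labels. Below is a minimal PyTorch sketch of one such distillation objective: a KL divergence between the retriever's distribution over retrieved passages and a target distribution derived from the reader's attention scores. The function name, temperature parameter, and the way scores are aggregated here are illustrative assumptions, not the authors' exact implementation; see the linked FiD repository for the real code.

```python
import torch
import torch.nn.functional as F

# Hypothetical tensors for one question with k retrieved passages:
#   retriever_scores: (k,) similarity scores from the retriever
#                     (e.g. dot products of question/passage embeddings)
#   attention_scores: (k,) reader cross-attention mass for each passage,
#                     assumed already aggregated over layers, heads, tokens
def distillation_loss(retriever_scores, attention_scores, temperature=1.0):
    # Synthetic target distribution built from the reader's attention
    target = F.softmax(attention_scores / temperature, dim=-1)
    # Retriever's predicted log-distribution over the same passages
    log_pred = F.log_softmax(retriever_scores / temperature, dim=-1)
    # KL(target || prediction): pushes the retriever to rank highest the
    # passages the reader attends to most
    return F.kl_div(log_pred, target, reduction="sum")

# Toy usage with random scores for k = 4 passages
k = 4
retriever_scores = torch.randn(k, requires_grad=True)
attention_scores = torch.randn(k)
loss = distillation_loss(retriever_scores, attention_scores)
loss.backward()  # gradients flow only into the retriever's scores
```

Note that gradients flow only through `retriever_scores`: the reader's attention acts as a fixed teacher signal, which is what lets the method avoid annotated query-document pairs.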
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We show that attention scores obtained by training a model to answer questions given a set of support documents can be used to train a model to select relevant passages in a knowledge source.
Code: [facebookresearch/FiD](https://github.com/facebookresearch/FiD) + [3 community implementations](https://paperswithcode.com/paper/?openreview=NTEz-6wysdb)
Data: [NarrativeQA](https://paperswithcode.com/dataset/narrativeqa), [Natural Questions](https://paperswithcode.com/dataset/natural-questions), [TriviaQA](https://paperswithcode.com/dataset/triviaqa)
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2012.04584/code)