BRENT: Bidirectional Retrieval Enhanced Norwegian Transformer

Published: 20 Mar 2023 · Last Modified: 29 Aug 2024 · NoDaLiDa 2023
Abstract: Retrieval-based language models are increasingly employed in question-answering tasks. These models search a corpus of documents for relevant information instead of storing all factual knowledge in their parameters, thereby enhancing efficiency, transparency, and adaptability. We develop the first Norwegian retrieval-based model by adapting the REALM framework and evaluate it on various tasks. After training, we also separate the language model, which we call the *reader*, from the retriever components, and show that it can be fine-tuned on a range of downstream tasks. Our results show that retrieval-augmented language modeling improves the reader's performance on extractive question answering, suggesting that this type of training improves language models' general ability to use context, and that it does so without degrading other abilities such as part-of-speech tagging, dependency parsing, named entity recognition, and lemmatization. Code, trained models, and data are made publicly available.
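The abstract describes the REALM-style retrieve-then-read decomposition at the core of BRENT: a retriever scores documents against the query, and the reader's answer distribution is marginalized over the retrieved documents, p(y|x) = Σ_z p(y|x, z) p(z|x). The sketch below illustrates only this retrieval-and-marginalization pattern; `toy_embed`, the example documents, and all names are hypothetical stand-ins, not the authors' released code.

```python
# Minimal sketch of REALM-style retrieve-then-read, under the assumption of a
# dense dual encoder. `toy_embed` is a deterministic stand-in for a trained
# encoder (e.g. a BERT [CLS] vector); it is illustrative only.
import numpy as np

def toy_embed(text: str, dim: int = 8) -> np.ndarray:
    """Hypothetical stand-in for a trained dense encoder."""
    seed = sum(ord(c) for c in text)          # stable per-string seed
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)              # unit-normalize embeddings

documents = [
    "Oslo er hovedstaden i Norge.",
    "Bergen ligger på Vestlandet.",
    "Trondheim var Norges første hovedstad.",
]
query = "Hva er hovedstaden i Norge?"

# Retriever: score each document by inner product with the query embedding,
# then normalize the scores into a distribution p(z | x), as in REALM.
q = toy_embed(query)
scores = np.array([toy_embed(d) @ q for d in documents])
p_doc = np.exp(scores) / np.exp(scores).sum()

# Reader: in REALM the answer distribution is marginalized over the top-k
# retrieved documents: p(y | x) = sum_z p(y | x, z) * p(z | x). Here we just
# show the top-k retrieval weights the reader would condition on.
top_k = np.argsort(-p_doc)[:2]
for idx in top_k:
    print(f"p(z|x)={p_doc[idx]:.3f}  doc: {documents[idx]}")
```

In the actual framework, both the encoder and the reader are trained jointly with the language-modeling objective, which is what allows the reader to be separated afterwards and fine-tuned on downstream tasks as the paper describes.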
Student Paper: Yes, the first author is a student
Community Implementations: [7 code implementations](https://www.catalyzex.com/paper/brent-bidirectional-retrieval-enhanced/code)