A Systematic Literature Review of Adapter-based Approaches to Knowledge-enhanced Language Models

ACL ARR 2024 April Submission 881 Authors

16 Apr 2024 (modified: 02 May 2024) · License: CC BY 4.0
Abstract: Knowledge-enhanced language models (KELMs) have emerged as promising tools to bridge the gap between large-scale language models and domain-specific knowledge. By leveraging knowledge graphs (KGs), KELMs can achieve higher factual accuracy and mitigate hallucinations. They are frequently combined with adapter modules to reduce computational cost and the risk of catastrophic forgetting. In this paper, we conduct a systematic literature review (SLR) of adapter-based approaches to KELMs. We provide an overview of the field and examine the strengths and potential shortcomings of the methods we identify. We show that both general-knowledge and domain-specific approaches have been explored across a variety of downstream tasks. Furthermore, we find that the biomedical domain is the most popular domain-specific field and that the Pfeiffer adapter is the most commonly used adapter type. We outline the main trends and propose promising directions for future work.
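For context on the adapter type the abstract highlights, the following is a minimal sketch of a Pfeiffer-style bottleneck adapter in PyTorch. It is not the authors' implementation; the class name, hidden size (768), and bottleneck size (48, i.e., a reduction factor of 16) are illustrative assumptions. The defining features are a down-projection, a nonlinearity, an up-projection, and a residual connection, with one such module inserted after the feed-forward sub-layer of each transformer block.

```python
import torch
import torch.nn as nn

class PfeifferAdapter(nn.Module):
    """Illustrative bottleneck adapter (Pfeiffer-style configuration).

    Down-projects the hidden state, applies a nonlinearity, up-projects
    back, and adds a residual connection. Only these few parameters are
    trained; the base model stays frozen, which keeps fine-tuning cheap
    and reduces the risk of catastrophic forgetting.
    """

    def __init__(self, hidden_size: int = 768, bottleneck_size: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)  # compress
        self.act = nn.ReLU()                                 # nonlinearity
        self.up = nn.Linear(bottleneck_size, hidden_size)    # expand

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection around the bottleneck transformation.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


# Usage sketch: apply the adapter to a batch of token representations.
adapter = PfeifferAdapter()
x = torch.randn(2, 16, 768)  # (batch, sequence length, hidden size)
out = adapter(x)             # same shape as the input: (2, 16, 768)
```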
Paper Type: Long
Research Area: Efficient/Low-Resource Methods for NLP
Research Area Keywords: parameter-efficient-training, knowledge-augmented methods, domain adaptation
Contribution Types: Surveys
Languages Studied: English
Submission Number: 881