GAEL: A Global-Aware Entity Linking Model Based on Retriever-Reader Paradigm

Published: 01 Jan 2024 · Last Modified: 04 Mar 2025 · MLNLP 2024 · License: CC BY-SA 4.0
Abstract: The field of space science and applications is rich in domain knowledge, and research on domain knowledge extraction and the preliminary construction of domain knowledge graphs has emerged. Entity linking is a fundamental component of intelligent applications built on knowledge graphs. Recently, the retriever-reader paradigm, inspired by question-answering tasks, has advanced the development of entity linking. However, existing studies overlook the relationships between entities, so their predictions lack global consistency; they also struggle to handle redundant entities effectively. We propose a Global-Aware Entity Linking model (GAEL) to address these issues. GAEL employs incremental training, integrating linking results over successive rounds: by incorporating the entities predicted in previous rounds into new training inputs, it continuously refines semantic information and strengthens interactions between candidate entities, which helps identify the entities that mentions refer to. In addition, GAEL uses a novel entity screening mechanism that encodes candidate entities and the input text separately and ranks candidates by semantic similarity, effectively resolving cases where multiple entities correspond to the same mention. Experimental results demonstrate that GAEL significantly improves entity linking performance on datasets from the field of space science and applications, as well as on multiple public datasets.
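The entity screening mechanism described above — encoding candidate entities and the input text separately, then ranking candidates by semantic similarity — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy bag-of-words encoder, the candidate descriptions, and the function names are all assumptions; a real system would use a trained neural encoder.

```python
import numpy as np

def embed(text, vocab):
    """Toy bag-of-words embedding (illustrative stand-in for a trained encoder)."""
    vec = np.zeros(len(vocab))
    for tok in text.lower().split():
        if tok in vocab:
            vec[vocab[tok]] += 1.0
    return vec

def rank_candidates(context, candidates, top_k=2):
    """Encode the mention context and each candidate entity description
    separately, then rank candidates by cosine similarity."""
    texts = [context] + [desc for _, desc in candidates]
    vocab = {tok: i for i, tok in enumerate(
        sorted({t for s in texts for t in s.lower().split()}))}
    ctx = embed(context, vocab)
    scored = []
    for name, desc in candidates:
        cand = embed(desc, vocab)
        denom = np.linalg.norm(ctx) * np.linalg.norm(cand)
        sim = float(ctx @ cand / denom) if denom else 0.0
        scored.append((name, sim))
    # Highest-similarity candidates first; redundant low-similarity
    # entities for the same mention fall to the bottom of the ranking.
    scored.sort(key=lambda x: x[1], reverse=True)
    return scored[:top_k]

# Hypothetical example: two candidate entities for the mention "payload".
context = "satellite payload for microgravity fluid experiments in space science"
candidates = [
    ("Payload (satellite)", "scientific instrument carried by a satellite for space experiments"),
    ("Payload (computing)", "data portion of a network packet"),
]
ranked = rank_candidates(context, candidates)
```

Separate encoding lets candidate representations be precomputed offline, so screening at inference time reduces to a similarity search over cached vectors.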