Parameter-Lite Adapter for Dynamic Entity Alignment

Published: 01 Jan 2022 · Last Modified: 24 Jun 2025 · PRICAI (1) 2022 · CC BY-SA 4.0
Abstract: Entity alignment (EA) aims to link entities that refer to the same real-world identity across different knowledge graphs (KGs). Most existing EA methods focus on static KGs, yet practical graphs grow and change over time. Although some EA methods study dynamic settings to accommodate these changes, they perform suboptimally because they neither guard against knowledge oblivion (forgetting what was learned from earlier snapshots) nor control the prohibitive model size. To address these issues, we propose a Parameter-Lite dynamic Entity Alignment model (PLEA), which leverages prior knowledge to embed entities and even represent unseen entities. We design a novel lightweight module that trains only the small number of parameters added by an adapter while keeping the original network fixed, so as to retain knowledge from previous snapshots at low computational cost. For unseen entities, we design a regularized entity mapping mechanism that injects prior knowledge into unseen entity embeddings to improve their representation ability. Experimental results on three real-world datasets demonstrate that PLEA achieves up to 4% higher accuracy with only 50% of the parameters of existing state-of-the-art methods.
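The core adapter idea described above (freeze the pretrained network, train only a small added module) can be illustrated with a minimal sketch. This is not the PLEA architecture from the paper; all names, dimensions, and the residual bottleneck design are illustrative assumptions, shown here only to make the parameter-saving mechanism concrete.

```python
import numpy as np

# Illustrative sketch, NOT the paper's actual model: a frozen base
# projection (pretrained on earlier KG snapshots) plus a small trainable
# bottleneck adapter added as a residual correction.

rng = np.random.default_rng(0)
d_model, d_bottleneck = 64, 8   # hypothetical embedding / bottleneck sizes

# Frozen base parameters: kept fixed when a new snapshot arrives.
W_base = rng.standard_normal((d_model, d_model))

# Trainable adapter parameters: down-projection, ReLU, up-projection.
# Zero-initializing the up-projection makes the adapter start as a no-op,
# so training begins from the base model's behavior.
W_down = rng.standard_normal((d_model, d_bottleneck)) * 0.01
W_up = np.zeros((d_bottleneck, d_model))

def adapter_forward(x: np.ndarray) -> np.ndarray:
    """Base output plus a residual adapter correction."""
    h = x @ W_base
    bottleneck = np.maximum(h @ W_down, 0.0)  # ReLU nonlinearity
    return h + bottleneck @ W_up

x = rng.standard_normal((4, d_model))
out = adapter_forward(x)

base_params = W_base.size                  # 64 * 64 = 4096, frozen
adapter_params = W_down.size + W_up.size   # 64*8 + 8*64 = 1024, trainable
print(out.shape)                           # (4, 64)
print(adapter_params / base_params)        # 0.25 in this toy setting
```

Only `W_down` and `W_up` would receive gradient updates per snapshot; the frozen `W_base` preserves knowledge from previous snapshots, which is the mechanism behind training far fewer parameters than retraining the full network.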