Keywords: Continual Learning, Knowledge Graphs, Catastrophic Forgetting, Elastic Weight Consolidation, Link Prediction
Abstract: Knowledge graphs (KGs) require continual updates as new information emerges, but neural embedding models suffer from catastrophic forgetting when learning new tasks sequentially. We evaluate Elastic Weight Consolidation (EWC), a regularization-based continual learning method, on KG link prediction using TransE embeddings on FB15k-237. Across 80 experiments with five random seeds, we find that EWC reduces catastrophic forgetting from 12.62% to 6.85%, a 45.7% relative reduction compared to naïve sequential training. We also observe that the task partitioning strategy affects the magnitude of forgetting: semantically coherent tasks exhibit 9.8 percentage points more forgetting than randomly partitioned tasks (12.62% vs 2.81%), indicating that task construction influences evaluation outcomes. While focused on a single embedding model and dataset, our results demonstrate that EWC effectively mitigates catastrophic forgetting in KG continual learning and highlight the importance of evaluation protocol design.
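For context, EWC mitigates forgetting by adding a quadratic penalty that anchors parameters deemed important for earlier tasks (weighted by a diagonal Fisher information estimate). The sketch below is illustrative only and not the paper's released code; the helper name `ewc_penalty`, the regularization weight `lam`, and the way the Fisher and reference parameters are stored are assumptions.

```python
import torch


def ewc_penalty(model, fisher, old_params, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    `fisher` and `old_params` are dicts keyed by parameter name,
    computed after training on the previous task (assumed layout).
    """
    penalty = 0.0
    for name, param in model.named_parameters():
        if name in fisher:
            # Penalize drift from the old task's solution, scaled by
            # the (diagonal) Fisher importance of each parameter.
            penalty = penalty + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty


# Hypothetical usage during training on a new task:
#   loss = transe_margin_loss(batch) + ewc_penalty(model, fisher, old_params, lam)
#   loss.backward()
```

In the setting described in the abstract, this penalty would be added to the usual margin-based TransE link-prediction loss when training on each new task partition; the naïve sequential baseline corresponds to omitting the penalty.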
Submission Number: 9