Causal Inference for Knowledge Graph Completion

Published: 01 Feb 2023, Last Modified: 13 Feb 2023 · Submitted to ICLR 2023 · Readers: Everyone
Keywords: Causal Inference, Knowledge Graph Completion
TL;DR: We propose causal KGC models that alleviate the data bias and interpretability issues of correlation-driven models by leveraging a causal inference framework.
Abstract: The basis of existing knowledge graph completion (KGC) models is to learn correlations in data, such as the correlation between entities or relations and the scores of triplets. Since the world is driven by causality rather than correlation, correlation-driven KGC models are weak in interpretability and suffer from data bias. In this paper, we propose causal KGC models that alleviate these issues by leveraging a causal inference framework. Our models are intuitive and interpretable through causal graphs, controllable through intervention techniques, and model-agnostic. Causal graphs allow us to explain the causal relationships between variables and the data generation process. Under the causal graph, data bias can be viewed as a confounder, and we block the harmful effect of confounders with intervention operators to mitigate the data bias issue. Because randomized data are difficult to obtain, causal KGC models pose unique challenges for evaluation, so we present a method that makes evaluation feasible. Finally, we give a group-theoretic view of KGC, which is equivalent to the causal view but further reveals the relationships between causal graphs. Experimental results show that our causal KGC models outperform traditional KGC models.
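The abstract does not spell out the authors' formulation, so the following is a minimal, hypothetical sketch of the general idea it describes: treating data bias as a confounder and removing it from a triplet scorer via backdoor-style adjustment (averaging the confounder out under its prior). All names, shapes, and the additive scorer are our own illustrative assumptions, not the paper's model.

```python
import numpy as np

# Hypothetical illustration of intervention (backdoor adjustment) for KGC.
# Assume a discrete confounder z (e.g., an entity-popularity bucket) with
# prior p_z, and a scorer conditioned on z. Names and shapes are assumptions.

rng = np.random.default_rng(0)
n_ent, n_rel, n_conf, dim = 100, 10, 4, 16

E = rng.normal(size=(n_ent, dim))    # entity embeddings
R = rng.normal(size=(n_rel, dim))    # relation embeddings
C = rng.normal(size=(n_conf, dim))   # confounder embeddings
p_z = np.full(n_conf, 1.0 / n_conf)  # confounder prior P(z)

def score_conditional(h, r, t, z):
    """Correlation-style score of triplet (h, r, t) conditioned on confounder z."""
    return float((E[h] + R[r] + C[z]) @ E[t])

def score_interventional(h, r, t):
    """Backdoor-style adjustment: average the confounder out under its prior,
    approximating a score aligned with P(t | do(h, r)) rather than P(t | h, r)."""
    return sum(p_z[z] * score_conditional(h, r, t, z) for z in range(n_conf))

print(score_interventional(h=3, r=1, t=7))
```

The intended contrast: a correlation-driven scorer implicitly conditions on the biased data-generating confounder, whereas the interventional score marginalizes it out, which is one concrete way to read "blocking the effect of confounders" in the abstract.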
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Applications (eg, speech processing, computer vision, NLP)
Supplementary Material: zip