Keywords: Knowledge Graph Completion, Factorisation-based Models, Message-Passing Graph Neural Network, Inductive Reasoning, Transductive Reasoning
TL;DR: Factorisation-based models like DistMult can be recast as message-passing graph neural networks, and thus can be inductivised by truncating the number of message-passing layers.
Abstract: Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and to generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactorGNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactorGNNs. Across a multitude of well-established KGC benchmarks, our ReFactorGNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.
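For intuition, here is a minimal NumPy sketch of the core observation behind the abstract: the gradient of the DistMult score with respect to a node embedding is a "message" from its neighbour modulated by the relation, so one gradient step over all triples looks like one message-passing layer. The names (`E`, `W`, `triples`) are illustrative, the sketch only covers the positive-score term (no loss or negative sampling), and it is not the paper's full ReFactorGNN layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: (head, relation, tail) triples over 4 nodes, 2 relations.
triples = [(0, 0, 1), (1, 1, 2), (0, 0, 3)]
num_nodes, num_rels, dim = 4, 2, 8

E = rng.normal(size=(num_nodes, dim))  # node embeddings
W = rng.normal(size=(num_rels, dim))   # relation embeddings (diagonal DistMult)

def distmult_score(h, r, t):
    """DistMult: tri-linear dot product <e_h, w_r, e_t>."""
    return np.sum(E[h] * W[r] * E[t])

def message_passing_step(lr=0.01):
    """One gradient-ascent step on the summed scores, written as message
    passing: each edge (h, r, t) sends the message e_h * w_r to node t
    (and e_t * w_r to node h), which is exactly the gradient of the score
    with respect to the receiving node's embedding."""
    msgs = np.zeros_like(E)
    for h, r, t in triples:
        msgs[t] += E[h] * W[r]  # d score / d e_t
        msgs[h] += E[t] * W[r]  # d score / d e_h
    return E + lr * msgs        # aggregate + update = one "GNN layer"

E = message_passing_step()
print(distmult_score(0, 0, 1))
```

Under this reading, truncating the number of such layers (rather than iterating gradient descent to convergence) is what makes the model applicable to unseen nodes, as the TL;DR suggests.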
Type Of Submission: Extended abstract (max 4 main pages).