FastLog: Efficient End-to-end Rule Learning Over Large-scale Knowledge Graphs by Reduction to Vector Operations

ACL ARR 2025 February Submission 3017 Authors

15 Feb 2025 (modified: 09 May 2025) · License: CC BY 4.0
Abstract: Logical rules play a crucial role in the evolution of knowledge graphs (KGs), as they can infer new facts from existing ones while providing explanations. In recent years, end-to-end rule learning has emerged as a promising paradigm for learning logical rules. Its key insight is to transform the rule learning problem in a discrete space into a parameter learning problem in a continuous space, by employing TensorLog operators to simulate the inference of logical rules. However, such TensorLog-based methods scale poorly when learning rules from large-scale KGs. To improve the efficiency and scalability of end-to-end rule learning, we propose FastLog, an efficient framework that reduces the vector-matrix multiplications in TensorLog to vector computations. FastLog is proven to have a lower time complexity than TensorLog. Extensive experimental results on a variety of benchmark KGs demonstrate that FastLog substantially improves the efficiency of end-to-end methods without degrading link prediction efficacy. Notably, when enhanced with FastLog, existing end-to-end methods can learn logical rules on two large-scale datasets with up to three hundred million triples, achieving efficacy comparable to the most advanced rule learners within the same training time.
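To make the abstract's core idea concrete: in TensorLog-style end-to-end rule learning, each relation r is represented as a sparse adjacency matrix M_r over entities, and applying a rule body such as r1(x,z) ∧ r2(z,y) amounts to chaining vector-matrix products starting from a one-hot query vector. Below is a minimal illustrative sketch in Python (assuming NumPy/SciPy). The TensorLog operator is standard in this literature; the gather-based "vector-only" variant afterwards is our own hypothetical illustration of the kind of reduction the abstract describes, not FastLog's actual construction, and all names (M_r, n_entities, etc.) are made up for the example.

import numpy as np
from scipy import sparse

n_entities = 5
# Toy edge list for one relation r: r(0,1), r(1,2), r(3,4).
rows, cols = np.array([0, 1, 3]), np.array([1, 2, 4])
# Sparse adjacency matrix M_r with M_r[i, j] = 1 iff r(i, j) holds.
M_r = sparse.csr_matrix((np.ones(3), (rows, cols)), shape=(n_entities, n_entities))

# TensorLog operator: propagate an entity-score vector v through relation r
# via a sparse vector-matrix product (a rule body chains several such steps).
v = np.zeros(n_entities)
v[0] = 1.0                                  # one-hot vector for query entity x
scores_tensorlog = M_r.T @ v                # equivalent to v @ M_r

# Hypothetical vector-level variant: since v is typically sparse, one can
# gather successors of the active entities directly from the edge list and
# scatter-add their scores, avoiding the full vector-matrix product.
heads, tails = rows, cols                   # edge-list view of relation r
active = np.flatnonzero(v)                  # currently active entities
mask = np.isin(heads, active)               # edges leaving active entities
scores_vector = np.zeros(n_entities)
np.add.at(scores_vector, tails[mask], v[heads[mask]])

assert np.allclose(scores_tensorlog, scores_vector)

The sketch only shows that edge-level vector operations can reproduce one TensorLog step; the paper's actual reduction and its time-complexity analysis are what FastLog contributes.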
Paper Type: Long
Research Area: Efficient/Low-Resource Methods for NLP
Research Area Keywords: End-to-end rule learning, Knowledge Graph, Link prediction
Contribution Types: NLP engineering experiment, Approaches low compute settings-efficiency, Publicly available software and/or pre-trained models, Theory
Languages Studied: English
Submission Number: 3017