Chain-of-Box Empowered Language Models for Logical Reasoning over Knowledge Graphs

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission
Abstract: Complex logical reasoning over large-scale knowledge graphs (KGs) is a fundamental yet challenging task. Current approaches mainly embed logical queries and KG entities into the same vector space and retrieve answers by similarity matching. However, the incompleteness of KGs severely limits the effectiveness of these methods. To tackle this knowledge-deficiency problem, we propose to use a language model as an additional knowledge reasoner and design a unified framework that integrates knowledge graph reasoning with natural language reasoning: the box embeddings along the reasoning trajectory are encoded as a chain-of-box and fused into the language model to strengthen its logical reasoning capability. Extensive experiments on two standard benchmark datasets demonstrate that our model, COB-LM, significantly improves over state-of-the-art methods.
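The abstract does not specify how box embeddings score candidate answers. As background, a common formulation (in the style of Query2Box-type models, not necessarily the exact one used here) represents a query as an axis-aligned box with a center and a non-negative offset, and scores an entity by its distance to that box; the function below is a minimal sketch under that assumption, with `alpha` a hypothetical down-weighting factor for the inside-box distance.

```python
import numpy as np

def box_distance(entity, center, offset, alpha=0.2):
    """Distance from an entity embedding to a query box (illustrative sketch).

    The box is the hyper-rectangle [center - offset, center + offset].
    The score sums the L1 distance from the entity to the box surface
    (zero if the entity lies inside) and a down-weighted distance from
    the box center, so entities inside the box score best.
    """
    lower, upper = center - offset, center + offset
    # Component-wise distance to the box surface; zero inside the box.
    dist_outside = np.maximum(entity - upper, 0.0) + np.maximum(lower - entity, 0.0)
    # Distance from the center to the entity clipped into the box.
    dist_inside = center - np.clip(entity, lower, upper)
    return np.sum(np.abs(dist_outside)) + alpha * np.sum(np.abs(dist_inside))
```

Under this formulation, answer entities contained in the query box receive strictly smaller distances than entities outside it, which is what makes boxes a natural container semantics for logical query answering.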
Paper Type: long
Research Area: NLP Applications
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Publicly available software and/or pre-trained models
Languages Studied: English