Great Truths are Always Simple: A Rather Simple Knowledge Encoder for Enhancing the Commonsense Reasoning Capacity of Pre-Trained Models

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=8v_Uc_w4vUt
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Commonsense reasoning in natural language is a desired ability of artificial intelligence systems. To solve complex commonsense reasoning tasks, a typical solution is to enhance pre-trained language models~(PTMs) with a knowledge-aware graph neural network~(GNN) encoder that models a commonsense knowledge graph~(CSKG). Despite their effectiveness, these approaches are built on heavy architectures and cannot clearly explain how external knowledge resources improve the reasoning capacity of PTMs. Considering this issue, we conduct a deep empirical analysis and find that it is indeed \emph{relation features} from CSKGs (but not \emph{node features}) that mainly contribute to the performance improvement of PTMs. Based on this finding, we design a simple MLP-based knowledge encoder that utilizes statistical relation paths as features. Extensive experiments conducted on five benchmarks demonstrate the effectiveness of our approach, which also largely reduces the parameters for encoding CSKGs. Our code and data are publicly available at~\url{https://github.com/RUCAIBox/SAFE}.
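The abstract's central idea, extracting statistical relation-path features between concept pairs in a CSKG and scoring them with a small MLP instead of a full GNN, can be sketched as below. This is a minimal illustration of how such a pipeline could look, assuming a toy triple store and illustrative hand-set MLP weights; it is not the paper's actual SAFE implementation, and all names, relations, and dimensions here are invented for the example.

```python
from collections import Counter

# Toy CSKG as (head, relation, tail) triples; real work uses ConceptNet-scale graphs.
TRIPLES = [
    ("bird", "capable_of", "fly"),
    ("fly", "related_to", "sky"),
    ("bird", "at_location", "sky"),
    ("fish", "at_location", "water"),
]

# Adjacency list: head -> [(relation, tail), ...]
adj = {}
for h, r, t in TRIPLES:
    adj.setdefault(h, []).append((r, t))

def relation_paths(src, dst, max_hops=2):
    """Enumerate relation sequences along paths from src to dst, up to max_hops edges."""
    paths = []
    stack = [(src, [])]
    while stack:
        node, rels = stack.pop()
        if node == dst and rels:
            paths.append(tuple(rels))
        if len(rels) < max_hops:
            for r, t in adj.get(node, []):
                stack.append((t, rels + [r]))
    return paths

# A fixed vocabulary of relation-path types; the feature vector counts how often
# each type connects the two concepts (the "statistical relation paths" idea).
PATH_VOCAB = [("capable_of",), ("at_location",), ("capable_of", "related_to")]

def path_features(src, dst):
    counts = Counter(relation_paths(src, dst))
    return [counts.get(p, 0) for p in PATH_VOCAB]

def mlp_score(x,
              W1=((1.0, 1.0, 1.0), (0.0, 1.0, 0.0)),  # illustrative weights,
              b1=(0.0, 0.0),                           # not learned parameters
              w2=(0.5, 0.5),
              b2=0.0):
    """Tiny 2-layer MLP (ReLU hidden layer) mapping path counts to a knowledge score."""
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# A connected concept pair yields nonzero path counts, hence a higher score
# than an unconnected pair; in practice this score would be combined with
# the PTM's own answer score.
print(path_features("bird", "sky"))   # two distinct path types found
print(mlp_score(path_features("bird", "sky")) > mlp_score(path_features("fish", "sky")))
```

The appeal of this formulation, as the abstract argues, is parameter count: the encoder is a few small weight matrices over a fixed path-type vocabulary rather than a multi-layer GNN over node embeddings.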
Presentation Mode: This paper will be presented virtually
Copyright Consent Signature (type Name Or NA If Not Transferrable): Jinhao Jiang
Copyright Consent Name And Address: Renmin University of China, 59 Zhongguancun St, Haidian District, Beijing, China, 100872.