Great Truths are Always Simple: A Rather Simple Knowledge Encoder for Enhancing the Commonsense Reasoning Capacity of Pre-Trained Models

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Commonsense reasoning in natural language is a desired capacity of artificial intelligence systems. To solve complex commonsense reasoning tasks, a typical approach is to enhance pre-trained language models (PTMs) with a knowledge-aware graph neural network (GNN) encoder that leverages commonsense knowledge graphs (CSKGs). Despite their effectiveness, these approaches are built on heavy architectures and cannot clearly explain how external knowledge resources improve the reasoning capacity of PTMs. Considering this issue, we conduct a deep empirical analysis and find that it is indeed \emph{relation features} from CSKGs (but not \emph{node features}) that mainly contribute to the performance improvement of PTMs. Based on this finding, we design a simple MLP-based knowledge encoder that utilizes statistical relation paths as features. Extensive experiments on five benchmarks demonstrate the effectiveness of our approach, which also largely reduces the number of parameters needed for encoding CSKGs.
Paper Type: long
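
To make the abstract's description concrete, below is a minimal PyTorch sketch of an MLP-based knowledge encoder over relation-path statistics, combined with a PTM's pooled text representation for answer scoring. All names (RelationPathMLP, PTMWithKnowledge), the dimensions, and the exact construction of the path-count features are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class RelationPathMLP(nn.Module):
    """Encode a fixed-length vector of relation-path statistics (e.g., counts
    of k-hop relation paths linking question and answer concepts in a CSKG)
    with a plain MLP. Hypothetical sketch, not the paper's exact encoder."""

    def __init__(self, num_path_types: int, hidden_dim: int = 128, out_dim: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_path_types, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, path_counts: torch.Tensor) -> torch.Tensor:
        # path_counts: (batch, num_path_types), non-negative path-count features
        return self.mlp(path_counts)


class PTMWithKnowledge(nn.Module):
    """Score an answer choice from the PTM's pooled text representation
    concatenated with the MLP-encoded relation-path features."""

    def __init__(self, ptm_dim: int, num_path_types: int, knowledge_dim: int = 128):
        super().__init__()
        self.knowledge_encoder = RelationPathMLP(num_path_types, out_dim=knowledge_dim)
        self.scorer = nn.Linear(ptm_dim + knowledge_dim, 1)

    def forward(self, ptm_pooled: torch.Tensor, path_counts: torch.Tensor) -> torch.Tensor:
        knowledge = self.knowledge_encoder(path_counts)
        return self.scorer(torch.cat([ptm_pooled, knowledge], dim=-1)).squeeze(-1)


# Hypothetical usage: a batch of 4 question-answer pairs, 200 relation-path
# types, and a 768-dimensional pooled PTM vector (e.g., a BERT [CLS] output).
model = PTMWithKnowledge(ptm_dim=768, num_path_types=200)
scores = model(torch.randn(4, 768), torch.rand(4, 200))
print(scores.shape)  # torch.Size([4])
```

Unlike a GNN that must embed CSKG nodes and run message passing per example, an encoder of this shape consumes only a vector of relation-path counts, which is consistent with the abstract's claim that the approach largely reduces the parameters needed for encoding CSKGs.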