Keywords: Commonsense Reasoning, Question Answering
TL;DR: We take a middle ground between large language models and knowledge graphs by using smaller language models together with a relatively small but targeted natural language text corpus to reason with implicit commonsense.
Abstract: Commonsense Reasoning is a research challenge studied since the early days of AI. In recent years, several natural language QA tasks have been proposed in which commonsense reasoning is important. Two common approaches to these tasks are (i) use of well-structured commonsense present in knowledge graphs, and (ii) use of progressively larger transformer language models. While acquiring and representing commonsense in a formal representation is challenging in approach (i), approach (ii) becomes increasingly resource-intensive. In this work, we take a middle ground where we use smaller language models together with a relatively small but targeted natural language text corpus. The advantage of such an approach is that it is less resource-intensive while still being able to use unstructured text corpora. We define different unstructured commonsense knowledge sources, explore three strategies for knowledge incorporation, and propose four methods, competitive with state-of-the-art methods, to reason with implicit commonsense.
Subject Areas: Question Answering and Reasoning
Archival Status: Archival
Supplementary Material: zip