Reasoning with Transformer-based Models: Deep Learning, but Shallow Reasoning

22 Jun 2021, 20:08 (modified: 14 Sept 2021, 16:20) · AKBC 2021
Keywords: Logical Reasoning, Mathematical Reasoning, Commonsense Reasoning, Transformers, BERT
Abstract: Recent years have seen impressive performance of transformer-based models on different natural language processing tasks. However, it remains unclear to what degree transformers can reason over natural language. To shed light on this question, this survey paper discusses the performance of transformers on different reasoning tasks, including mathematical reasoning, commonsense reasoning, and logical reasoning. We point out successes and limitations of both an empirical and a theoretical nature.
Subject Areas: Question Answering and Reasoning
Archival Status: Archival
Supplementary Material: zip