Exploring the Role of Reasoning Structures for Constructing Proofs in Multi-Step Natural Language Reasoning with Large Language Models

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission
Abstract: When performing complex multi-step reasoning tasks, the ability of Large Language Models (LLMs) to derive structured intermediate proof steps is important for ensuring that the models truly perform the desired reasoning. This paper presents a focused study of whether current state-of-the-art LLMs can leverage the proof structures present in a few demonstration examples to construct their own proof structures when performing complex natural language reasoning. Our study specifically examines structure-aware demonstration and structure-aware pruning, and we show that both improve performance. We provide a detailed analysis to help understand the results.
Paper Type: short
Research Area: Semantics: Sentence-level Semantics, Textual Inference and Other areas
Contribution Types: NLP engineering experiment, Approaches to low-resource settings
Languages Studied: English