Decompose-and-Formalise: Recursively Verifiable Natural Language Inference

ACL ARR 2026 January Submission 10142 Authors

06 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Natural Language Inference, Logical Reasoning, Theorem Proving, Large Language Models
Abstract: Recent work has shown that integrating large language models (LLMs) with theorem provers (TPs) in neuro-symbolic pipelines helps with autoformalisation, entailment verification, and proof-guided refinement of explanations for natural language inference (NLI). However, scaling such refinement to naturalistic NLI remains difficult: long, syntactically rich inputs and deep multi-step arguments amplify autoformalisation errors, where a single local mismatch can invalidate the entire proof. Moreover, because prover diagnostics rarely localise the responsible span or step, current methods often handle failures via costly global regeneration. To address these problems, we propose a decompose-and-formalise framework that (i) decomposes premise–hypothesis pairs into an entailment tree of atomic steps, (ii) verifies the tree bottom-up to isolate failures to specific nodes, and (iii) performs local, diagnostic-guided refinement instead of regenerating the whole explanation. In addition, to improve the faithfulness of autoformalisation, we introduce $\theta$-substitution in an event-based logical form to enforce consistent argument–role bindings. Across a range of reasoning tasks and five LLM backbones, our method achieves the highest verified-explanation rates, improving over the state of the art by 26.2\%, 21.7\%, 21.6\%, and 48.9\%, while reducing refinement iterations and runtime and preserving strong end-task NLI accuracy.
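A minimal sketch of the bottom-up verify-and-refine loop the abstract describes, assuming a simple tree data structure; `Node`, `prove_step`, and `refine_node` are hypothetical stand-ins for the paper's entailment-tree representation, theorem-prover call, and LLM-based local refinement, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    """One atomic entailment step: child premises supporting a conclusion."""
    conclusion: str
    premises: List["Node"] = field(default_factory=list)
    formal: Optional[str] = None  # autoformalised logical form (hypothetical)

def prove_step(node: Node) -> bool:
    """Stand-in for a theorem-prover call on one atomic step.
    In the paper's pipeline this would check the node's formalisation with a TP."""
    return node.formal is not None  # placeholder success criterion

def refine_node(node: Node) -> None:
    """Stand-in for diagnostic-guided local refinement of a single node,
    e.g. re-prompting an LLM with the prover's error message."""
    node.formal = f"refined({node.conclusion})"

def verify_bottom_up(node: Node, max_retries: int = 3) -> bool:
    """Verify the entailment tree leaves-first; on failure, refine only the
    offending node instead of regenerating the whole explanation."""
    # Verify children before the parent so a failure is attributed
    # to the lowest unprovable step in the tree.
    for child in node.premises:
        if not verify_bottom_up(child, max_retries):
            return False
    for _ in range(max_retries):
        if prove_step(node):
            return True
        refine_node(node)  # local, diagnostic-guided repair
    return False
```

Verifying children before parents ensures each failure is isolated to the lowest node whose step does not prove, so refinement stays local rather than global.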
Paper Type: Long
Research Area: Semantics: Lexical, Sentence-level Semantics, Textual Inference and Other areas
Research Area Keywords: textual entailment, natural language inference
Contribution Types: Model analysis & interpretability
Languages Studied: English
Submission Number: 10142