Do Numbers Speak? Questioning Quantity for Enhanced Fact-Checking

ACL ARR 2025 February Submission 7142 Authors

16 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: Despite advancements in automated fact-checking, a notable gap remains in verifying complex claims, particularly those involving numerical data. This underscores the need for fact-checking systems that can accurately assess quantitative claims. To address this issue, we introduce QLAIM, a pioneering multi-domain dataset focused exclusively on quantitative claims. It comprises 33k fact-checked claims covering various types of quantitative information, including comparative, statistical, interval, and temporal claims, accompanied by detailed metadata and supporting evidence. In conjunction with QLAIM, we present Q2FC, a comprehensive fact-checking framework designed to replicate the investigative rigour of human fact-checkers. Our approach employs controlled question generation to create precise queries that guide the verification process and retrieve relevant responses. This enhances the explanatory power of our model while ensuring data efficiency through clear, human-like inquiries. Empirical evaluations show that our framework significantly outperforms recent fact-checking baselines.
Paper Type: Long
Research Area: Computational Social Science and Cultural Analytics
Research Area Keywords: fact-checking, information retrieval, social media text
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models, Data resources, Data analysis
Languages Studied: English
Submission Number: 7142