Improving Table Retrieval with Question Generation from Partial Tables

Published: 05 Jun 2025 · Last Modified: 05 Jun 2025 · TRL@ACL 2025 · CC BY 4.0
Keywords: Table Retrieval, Question Generation, Open-Domain Question Answering
TL;DR: We propose QGpT, a lightweight framework that uses simulated questions generated from partial tables to enhance semantic alignment with user queries, significantly improving table retrieval performance without retriever fine-tuning.
Abstract: Recent advances in open-domain question answering over tables have widely adopted large language models (LLMs) under the Retriever-Reader architecture. Prior works have effectively leveraged LLMs to tackle the complex reasoning demands of the Reader component, such as text-to-text, text-to-SQL, and multi-hop reasoning. In contrast, work on the Retriever component has primarily focused on optimizing the query representation: training retrievers to retrieve relevant tables based on questions, or to select keywords from questions for matching table segments. However, little attention has been given to enhancing how tables themselves are represented in embedding space to better align with questions. To address this, we propose QGpT (Question Generation from Partial Tables), a simple yet effective method that uses an LLM to generate synthetic questions based on small portions of a table. These questions simulate how a user might query the content of the table under consideration. The generated questions are then jointly embedded with the partial table segments used for generation, enhancing semantic alignment with user queries. Without the need to embed entire tables, our method significantly improves retrieval performance across multiple benchmarks for both dense and late-interaction retrievers.
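
The sketch below illustrates the core idea described in the abstract: linearize a small portion of a table, prompt an LLM for questions a user might ask about it, and embed the questions jointly with the partial table as the table's retrieval representation. The prompt wording, number of rows, stubbed LLM call, and choice of embedding model are assumptions for illustration, not the authors' implementation.

```python
# Minimal QGpT-style sketch (illustrative only): questions generated from a
# partial table are concatenated with that partial table before embedding,
# so the table representation better matches how users phrase queries.
from sentence_transformers import SentenceTransformer, util


def take_partial_table(table: dict, n_rows: int = 3) -> str:
    """Linearize the header plus the first few rows of a table."""
    header = " | ".join(table["columns"])
    rows = [" | ".join(map(str, r)) for r in table["rows"][:n_rows]]
    return "\n".join([header] + rows)


def generate_questions(partial_table: str) -> list[str]:
    """Placeholder for the LLM call. In practice, an LLM would be prompted
    with the partial table and asked for questions a user might pose."""
    # e.g. llm(f"Write questions answerable from this table:\n{partial_table}")
    return [
        "Which country hosted the 2016 Summer Olympics?",
        "How many gold medals did the United States win in 2016?",
    ]


table = {
    "columns": ["Year", "Host country", "US gold medals"],
    "rows": [[2016, "Brazil", 46], [2012, "United Kingdom", 46], [2008, "China", 36]],
}

partial = take_partial_table(table)
questions = generate_questions(partial)

# Jointly represent the partial table and its simulated questions as one document.
corpus_doc = partial + "\n" + "\n".join(questions)

model = SentenceTransformer("all-MiniLM-L6-v2")  # any off-the-shelf dense retriever
doc_emb = model.encode(corpus_doc, convert_to_tensor=True)

query = "Where were the 2016 Olympics held?"
query_emb = model.encode(query, convert_to_tensor=True)
print(float(util.cos_sim(query_emb, doc_emb)))  # higher score = better alignment
```

In this setup no retriever fine-tuning is needed: only the indexed table representation changes, so the same approach could plug into dense or late-interaction retrievers.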
Include In Proceedings: Yes
Submission Number: 28