Sparse Attention for Tabular QA: A Must-Have for Robust Table Encoding

Published: 21 Feb 2025, Last Modified: 21 Feb 2025 · RLGMSD 2024 Poster · CC BY 4.0
Keywords: Tabular Data, Sparse Attention, Generalization
TL;DR: We show that sparse attention improves generalization in tabular QA.
Abstract: The structured nature of tabular data poses significant challenges for deep learning models, which lose structural information when converting tables into linear sequences. While prior work has proposed methods to preserve structure, these architectures still suffer from generalization issues. In this study, we investigate the impact of encoding techniques on generalization. Our results demonstrate that sparse attention mechanisms, by focusing on key table components during encoding, significantly enhance the model's structural understanding.
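
As an illustration of the kind of structure-aware sparsity the abstract describes, below is a minimal sketch of a row/column attention mask for a linearized table. This is not the paper's actual implementation; the function name, the use of -1 to mark non-table (e.g. question) tokens, and the PyTorch framing are assumptions made for the example.

```python
import torch

def table_sparse_attention_mask(row_ids, col_ids):
    """Boolean mask where token i may attend to token j only if they share
    a row or a column of the table. Tokens with index -1 (question or
    special tokens) attend globally and can be attended by everyone."""
    same_row = row_ids.unsqueeze(0) == row_ids.unsqueeze(1)
    same_col = col_ids.unsqueeze(0) == col_ids.unsqueeze(1)
    mask = same_row | same_col
    global_tok = (row_ids == -1) | (col_ids == -1)
    mask |= global_tok.unsqueeze(0) | global_tok.unsqueeze(1)
    return mask

# Example: two question tokens followed by a 2x2 table linearized cell by cell.
row_ids = torch.tensor([-1, -1, 0, 0, 1, 1])
col_ids = torch.tensor([-1, -1, 0, 1, 0, 1])
mask = table_sparse_attention_mask(row_ids, col_ids)

# Apply the mask inside standard scaled dot-product attention.
q = k = v = torch.randn(1, 6, 16)          # (batch, seq_len, dim)
scores = q @ k.transpose(-2, -1) / 16 ** 0.5
scores = scores.masked_fill(~mask, float("-inf"))
out = torch.softmax(scores, dim=-1) @ v    # (1, 6, 16)
```

The design choice here is simply that a cell token only exchanges information with its own row and column, which is one common way to keep table structure visible to the encoder after linearization.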
Submission Number: 7