Tabular Representation, Noisy Operators, and Impacts on Table Structure Understanding Tasks in LLMs

Published: 28 Oct 2023, Last Modified: 27 Nov 2023, TRL @ NeurIPS 2023 Oral
Keywords: large language models, in-context learning, table structure
Abstract: Large language models (LLMs) are increasingly applied to tabular tasks using in-context learning. The prompt representation chosen for a table may affect an LLM's ability to process that table. Inspired by prior work, we generate a collection of self-supervised structural tasks (e.g., navigate to a given cell or row; transpose the table) and evaluate the performance differences across 8 table formats. In contrast to past work, we introduce 8 noise operations inspired by real-world messy data and adversarial inputs, and show that such operations can impact LLM performance across formats on different structural understanding tasks.
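As a rough illustration of the setup (a minimal sketch, not the authors' code), the snippet below serializes a toy table in two of the many possible prompt formats and applies one hypothetical noise operation. The markdown/JSON format pair and the `shuffle_rows` operation are assumptions chosen for illustration; they are not necessarily among the paper's 8 formats or 8 noise operations.

```python
import json
import random

# A toy table: one header row plus data rows.
table = {
    "header": ["city", "population"],
    "rows": [["Oslo", "709037"], ["Bergen", "285911"]],
}

def to_markdown(t):
    """Serialize the table as a markdown pipe table (one candidate prompt format)."""
    lines = ["| " + " | ".join(t["header"]) + " |",
             "| " + " | ".join("---" for _ in t["header"]) + " |"]
    lines += ["| " + " | ".join(row) + " |" for row in t["rows"]]
    return "\n".join(lines)

def to_json_records(t):
    """Serialize the table as a JSON list of records (another candidate format)."""
    return json.dumps([dict(zip(t["header"], row)) for row in t["rows"]])

def shuffle_rows(t, seed=0):
    """One hypothetical noise operation: permute the row order."""
    rng = random.Random(seed)
    rows = t["rows"][:]
    rng.shuffle(rows)
    return {"header": t["header"], "rows": rows}

# Build prompts from clean and noised versions of the same table, so the
# LLM's structural-task accuracy can be compared across format x noise.
print(to_markdown(table))
print(to_json_records(shuffle_rows(table)))
```

In an evaluation like the one described, each (format, noise operation) pair would yield a distinct prompt rendering of the same underlying table, and structural tasks such as cell navigation or transposition would be scored against the known ground truth.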
Slides: pdf
Submission Number: 25