Task-Aligned Attention Retrieval for Scaling Tabular Foundation Models

15 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · License: CC BY 4.0
Keywords: Retrieval, Tabular data, Foundation models
Abstract: Retrieval is an effective way to address various scaling challenges in in-context learning with tabular foundation models, yet prevailing methods select neighbors by Euclidean proximity in the covariates and thus ignore how the task mapping varies across the feature space. We introduce Task-Aligned Attention Retrieval (TAAR), a simple, model-agnostic procedure that, for each query, selects the most predictive features and the most relevant context samples using the model's own attention scores. TAAR therefore ranks candidates by a task-aligned similarity already internalized by the foundation model rather than by raw geometric distance in the input features. TAAR is a drop-in module for state-of-the-art tabular foundation models (e.g., TabPFN and LimiX), requires no fine-tuning, and adds only a single extra forward pass. On classification and regression benchmarks, TAAR achieves pronounced gains in accuracy and stability over current retrieval methods and supports scaling in feature dimensionality, sample size, and target-class cardinality.
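The abstract's core contrast — ranking context candidates by the model's attention scores instead of Euclidean distance — can be illustrated with a minimal sketch. This is not the paper's implementation: the projection matrices `W_q` and `W_k` stand in for the (frozen) foundation model's attention projections, and all names here are hypothetical.

```python
import numpy as np

def euclidean_topk(query, X, k):
    # Baseline retrieval: nearest neighbors by raw geometric
    # distance in the input feature space.
    d = np.linalg.norm(X - query, axis=1)
    return np.argsort(d)[:k]

def attention_topk(query, X, W_q, W_k, k):
    # TAAR-style ranking (sketch): score each candidate context
    # sample by the attention weight the model would assign it,
    # i.e. a softmax over scaled query-key dot products, and keep
    # the top-k. W_q and W_k are stand-ins for the model's own
    # learned attention projections.
    q = query @ W_q                        # (d_h,)
    K = X @ W_k                            # (n, d_h)
    logits = K @ q / np.sqrt(q.shape[0])   # scaled dot-product scores
    attn = np.exp(logits - logits.max())   # numerically stable softmax
    attn /= attn.sum()
    return np.argsort(-attn)[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))      # candidate context samples
query = rng.normal(size=8)         # a single query row
W_q = rng.normal(size=(8, 4))      # hypothetical attention projections
W_k = rng.normal(size=(8, 4))
idx = attention_topk(query, X, W_q, W_k, k=10)
```

The two rankings generally differ: attention-based scores reflect a learned, task-aligned notion of relevance, whereas Euclidean distance treats all feature directions equally.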
Primary Area: foundation or frontier models, including LLMs
Submission Number: 5393