Keywords: tabular classification, tabular in-context learning transformers, architecture
TL;DR: We introduce a new architecture for tabular in-context learning transformers that boosts classification accuracy.
Abstract:
Tabular In-Context Learning (ICL) transformers, such as TabPFN and TabForestPFN, have shown strong performance on tabular classification tasks. In this paper, we introduce Attic, a new architecture for ICL transformers. Unlike TabPFN and TabForestPFN, where one token represents all features of one observation, Attic assigns one token to each feature of every observation. This simple architectural change results in a significant performance boost. As a result, we can confidently say that neural networks outperform tree-based methods like XGBoost.
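To make the tokenization difference concrete, here is a minimal PyTorch sketch, not the authors' implementation: `RowTokenizer`, `CellTokenizer`, and all dimensions are illustrative assumptions based only on the abstract's description.

```python
import torch
import torch.nn as nn


class RowTokenizer(nn.Module):
    """TabPFN/TabForestPFN-style: one token per observation,
    all features projected jointly into a single embedding."""
    def __init__(self, n_features: int, d_model: int):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)

    def forward(self, x):                 # x: (batch, n_obs, n_features)
        return self.proj(x)               # (batch, n_obs, d_model): n_obs tokens


class CellTokenizer(nn.Module):
    """Attic-style, as described in the abstract: one token per
    feature of every observation (each scalar cell embedded independently)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(1, d_model)

    def forward(self, x):                 # x: (batch, n_obs, n_features)
        return self.proj(x.unsqueeze(-1)) # (batch, n_obs, n_features, d_model)


x = torch.randn(2, 8, 5)                  # 2 tasks, 8 observations, 5 features
print(RowTokenizer(5, 32)(x).shape)       # torch.Size([2, 8, 32])    -> 8 tokens
print(CellTokenizer(32)(x).shape)         # torch.Size([2, 8, 5, 32]) -> 40 tokens
```

The per-cell scheme produces n_obs × n_features tokens instead of n_obs, so attention can relate individual features across observations, at the cost of a longer sequence.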
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3842