Submission Type: Short paper (4 pages)
Keywords: tabular deep learning, graph neural network, feature interactions
TL;DR: Graph-based tabular deep learning improves when the correct feature-interaction graph is used; current methods fail to learn it.
Abstract: Tabular data is characterized by complex, dataset-specific feature interactions. Graph-based tabular deep learning (GTDL) methods aim to capture these interactions by representing features and their relationships as a graph. However, existing methods predominantly optimize predictive accuracy, neglecting accurate modeling of the graph structure. In this work, we argue that GTDL should move beyond prediction-centric objectives and prioritize the explicit learning and evaluation of feature interactions. Using synthetic datasets with known ground-truth graph structures, we show that existing GTDL methods fail to recover meaningful feature interactions. Moreover, we show that enforcing the true interaction structure improves predictive performance. This highlights the need for GTDL methods to prioritize quantitative evaluation and accurate structural learning.
Submission Number: 26
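As a rough illustration of the evaluation setup the abstract describes, the sketch below generates a synthetic tabular dataset whose target depends only on a known set of pairwise feature interactions, encoded as a ground-truth graph. The function name, edge list, and multiplicative interaction form are assumptions made for illustration, not the paper's actual benchmark.

```python
import numpy as np

def make_synthetic_interaction_data(n_samples=10_000, n_features=8, seed=0):
    """Toy generator: the target depends only on a known set of pairwise
    feature interactions, given by the edges of a ground-truth graph.
    (Illustrative assumption, not the paper's data-generating process.)"""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n_samples, n_features))

    # Ground-truth interaction graph as an edge list over feature indices.
    true_edges = [(0, 1), (2, 3), (4, 5)]
    adjacency = np.zeros((n_features, n_features))
    for i, j in true_edges:
        adjacency[i, j] = adjacency[j, i] = 1.0

    # Target is a sum of products over interacting feature pairs plus noise,
    # so a model succeeds only if it captures exactly these interactions.
    y = sum(X[:, i] * X[:, j] for i, j in true_edges)
    y = y + 0.1 * rng.normal(size=n_samples)
    return X, y, adjacency

X, y, A_true = make_synthetic_interaction_data()
```

With such data, a graph learned by a GTDL model can be scored against `A_true` (for example, via edge-recovery precision and recall) in addition to predictive accuracy, which is the kind of quantitative structural evaluation the abstract argues for.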