Low-Rank Density Matrices as Concept Bottlenecks for Tabular Classification

07 May 2026 (modified: 09 May 2026) · ICML 2026 Workshop CoLoRAI Submission · CC BY 4.0
Keywords: concept bottleneck models, tabular classification, density matrices, low-rank representations, interpretable machine learning
TL;DR: Concept bottleneck models work for tabular data, and replacing cosine-prototype concepts with low-rank density matrices makes them the best non-foundation method on CC18.
Abstract: Tabular prediction is largely dominated by tree-based methods and, more recently, foundation models, leaving alternative modeling paradigms underexplored. Concept Bottleneck Models (CBMs), which route predictions through intermediate concept representations, are one such direction. Prior work on tabular data (TabCBM) showed competitive performance but has seen limited follow-up. This work revisits CBMs for tabular learning. We introduce a modern implementation of TabCBM and show that it remains competitive with strong baselines. We then propose \textbf{QuantumBind} (QB), a novel concept layer that represents concepts as low-rank density matrices interacting through learned positive semi-definite observables. Sequential evaluation of these interactions induces structured nonlinearities beyond standard concept scoring mechanisms.

On OpenML-CC18, QB achieves 88.7\% mean accuracy, outperforming TabCBM and strong non-foundation baselines including TabM (88.3\%), CatBoost (87.5\%), and XGBoost (87.3\%). We find that multiple small matrices outperform a single large one, and that the benefits of sequential evaluation are dataset-dependent. A unique property of QB is that its receptor observables are feature-count invariant: trained on one set of datasets, they transfer to new tasks with different numbers of features, a form of cross-dataset concept transfer not available to other tabular methods. These results position CBMs as a competitive alternative for modern tabular learning, with QB providing a strong extension.
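The abstract does not specify QB's exact scoring rule, but the standard way to pair a low-rank density matrix with a positive semi-definite observable is the expectation value Tr(ρM). A minimal NumPy sketch of that interaction, with hypothetical dimensions `d` (embedding size) and `r` (rank), assuming ρ = BBᵀ/Tr(BBᵀ) and a "receptor" observable M = AAᵀ:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # hypothetical embedding dimension and low rank

# Low-rank density matrix: rho = B B^T / Tr(B B^T)
# (symmetric, PSD, unit trace, rank <= r by construction)
B = rng.normal(size=(d, r))
rho = B @ B.T
rho /= np.trace(rho)

# Learned PSD observable ("receptor"): M = A A^T
A = rng.normal(size=(d, r))
M = A @ A.T

# Concept activation as an expectation value; Tr(rho M) >= 0
# because both factors are PSD, giving a nonnegative score.
score = np.trace(rho @ M)
print(score)
```

Note the PSD parameterizations: learning the factors B and A (rather than ρ and M directly) keeps both matrices valid under unconstrained gradient updates, which is the usual motivation for factorized density-matrix layers.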
Submission Number: 78