Probabilistic Pretraining for Improved Neural Regression

TMLR Paper6084 Authors

03 Oct 2025 (modified: 04 Dec 2025) · Under review for TMLR · CC BY 4.0
Abstract: While transfer learning has revolutionized computer vision and natural language processing, its application to probabilistic regression remains underexplored, particularly for tabular data. We introduce NIAQUE (Neural Interpretable Any-Quantile Estimation), a novel permutation-invariant architecture that enables effective transfer learning across diverse regression tasks. Through extensive experiments on 101 datasets, we demonstrate that pre-training NIAQUE on multiple datasets and fine-tuning on target datasets consistently outperforms both traditional tree-based models and transformer-based neural baselines. On real-world Kaggle competitions, NIAQUE achieves competitive performance against heavily hand-crafted, feature-engineered solutions and outperforms strong baselines such as TabPFN and TabDPT, while maintaining interpretability through its probabilistic framework. Our results establish NIAQUE as a robust and scalable approach for tabular regression, effectively bridging the gap between traditional methods and modern transfer learning.
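As background for the "any-quantile estimation" named in the abstract, here is a minimal PyTorch sketch of how such models are commonly trained: the network receives a quantile level tau as an extra input and is optimized with the pinball (quantile) loss under randomly sampled tau values. This is a generic illustration of the technique, not the paper's actual NIAQUE architecture; all names and hyperparameters below are assumptions.

```python
import torch
import torch.nn as nn

class AnyQuantileHead(nn.Module):
    """Hypothetical sketch (not the paper's model): an MLP that maps
    input features plus a quantile level tau in (0, 1) to the
    predicted tau-quantile of the target."""
    def __init__(self, d_in: int, d_hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in + 1, d_hidden),  # +1 for the tau input
            nn.ReLU(),
            nn.Linear(d_hidden, 1),
        )

    def forward(self, x: torch.Tensor, tau: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, tau], dim=-1)).squeeze(-1)

def pinball_loss(pred: torch.Tensor, target: torch.Tensor,
                 tau: torch.Tensor) -> torch.Tensor:
    """Quantile (pinball) loss: penalizes under- and over-prediction
    asymmetrically according to the quantile level tau."""
    err = target - pred
    return torch.mean(torch.maximum(tau * err, (tau - 1.0) * err))

# Training-step sketch on a toy batch: sampling a fresh tau per example
# encourages the model to learn the full conditional distribution
# rather than a single fixed quantile.
model = AnyQuantileHead(d_in=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 10), torch.randn(32)  # toy features and targets
tau = torch.rand(32, 1)                      # random quantile levels
loss = pinball_loss(model(x, tau), y, tau.squeeze(-1))
opt.zero_grad()
loss.backward()
opt.step()
```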
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Andres_R_Masegosa1
Submission Number: 6084