Uncertainty Estimation with Recursive Feature Machines

Published: 26 Apr 2024, Last Modified: 15 Jul 2024 · UAI 2024 poster · CC BY 4.0
Keywords: Uncertainty estimation, kernel methods, Gaussian processes, boosting, feature learning
TL;DR: We combine a novel feature-learning kernel (RFMs) with Gaussian processes for uncertainty quantification to surpass the performance of SOTA methods on tabular datasets.
Abstract: In conventional regression analysis, predictions are typically point estimates derived from covariates. Gaussian Processes (GPs) offer a kernel-based framework that yields predictions together with quantified uncertainties. However, kernel-based methods often underperform ensemble-based decision-tree approaches on regression tasks involving tabular and categorical data. Recently, Recursive Feature Machines (RFMs) were proposed as a novel feature-learning kernel that strengthens the capabilities of kernel machines. In this study, we harness RFMs in a probabilistic GP-based approach to enhance uncertainty estimation through feature learning within kernel methods. We employ this learned kernel for in-depth uncertainty analysis. On tabular datasets, our RFM-based method surpasses other leading uncertainty estimation techniques, including NGBoost and CatBoost-ensemble. Moreover, in out-of-distribution settings, our RFM-based approach also outperforms these boosting-based methods.
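The idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation (see the Code Url below for that): an RFM learns a Mahalanobis-type matrix M for a Laplace kernel by iterating between kernel ridge regression and the average gradient outer product (AGOP) of the fitted predictor, and the learned kernel is then plugged into a standard GP posterior to obtain predictive means and variances. The bandwidth, ridge/noise values, iteration count, and trace normalization here are illustrative choices, not the paper's settings.

```python
import numpy as np

def laplace_kernel_M(X, Z, M, bandwidth=10.0):
    """Mahalanobis Laplace kernel k(x,z) = exp(-||x-z||_M / L), ||v||_M = sqrt(v^T M v)."""
    XM = X @ M
    d2 = (np.sum(XM * X, axis=1)[:, None]
          + np.sum((Z @ M) * Z, axis=1)[None, :]
          - 2.0 * XM @ Z.T)
    return np.exp(-np.sqrt(np.maximum(d2, 0.0)) / bandwidth)

def rfm_fit(X, y, iters=3, reg=1e-3, bandwidth=10.0):
    """Learn the feature matrix M by alternating ridge regression and the AGOP update."""
    n, d = X.shape
    M = np.eye(d)
    for _ in range(iters):
        K = laplace_kernel_M(X, X, M, bandwidth)
        alpha = np.linalg.solve(K + reg * np.eye(n), y)
        # AGOP: average outer product of the predictor's gradient over training points.
        # For f(x) = sum_j alpha_j k(x, x_j):
        #   grad f(x_i) = -sum_j alpha_j * k_ij / (L * dist_ij) * M (x_i - x_j)
        G = np.zeros((d, d))
        for i in range(n):
            diff = X[i] - X                                   # (n, d) rows x_i - x_j
            dist = np.sqrt(np.maximum(np.sum((diff @ M) * diff, axis=1), 1e-12))
            w = (K[i] / (bandwidth * dist)) * alpha           # per-point gradient weights
            g = -(diff * w[:, None]).sum(axis=0) @ M
            G += np.outer(g, g)
        M = G / n
        M /= np.trace(M) / d   # rescale so trace(M) = d (illustrative normalization)
    return M

def gp_posterior(Xtr, ytr, Xte, M, noise=1e-2, bandwidth=10.0):
    """Standard GP regression posterior using the learned RFM kernel."""
    Ktr = laplace_kernel_M(Xtr, Xtr, M, bandwidth) + noise * np.eye(len(Xtr))
    Kst = laplace_kernel_M(Xte, Xtr, M, bandwidth)
    Kss = laplace_kernel_M(Xte, Xte, M, bandwidth)
    L = np.linalg.cholesky(Ktr)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mean = Kst @ alpha
    v = np.linalg.solve(L, Kst.T)
    var = np.diag(Kss) - np.sum(v * v, axis=0) + noise       # predictive variance
    return mean, var
```

On data where only a few covariates matter, the AGOP concentrates mass of M on those directions, so the GP's predictive variance reflects distance in the learned feature space rather than raw input space.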
List Of Authors: Gedon, Daniel and Abedsoltan, Amirhesam and Schön, Thomas B and Belkin, Mikhail
Latex Source Code: zip
Signed License Agreement: pdf
Code Url: https://github.com/dgedon/rfm_uncertainty
Submission Number: 486