SmartCal: A Novel Automated Approach to Classifier Probability Calibration

Published: 03 Jun 2025, Last Modified: 03 Jun 2025 · AutoML 2025 Methods Track · CC BY 4.0
Confirmation: our paper adheres to reproducibility best practices. In particular, we confirm that all important details required to reproduce the results are described in the paper; the authors agree to the paper being made available online through OpenReview under a CC-BY 4.0 license (https://creativecommons.org/licenses/by/4.0/); and the authors have read and commit to adhering to the AutoML 2025 Code of Conduct (https://2025.automl.cc/code-of-conduct/).
Reproducibility: pdf
TL;DR: Automated Machine Learning Framework for Post-hoc Calibration of Classification Algorithms
Abstract: Accurate probability estimates are crucial in classification, yet widely used calibration methods such as Platt scaling and temperature scaling fail to generalize across diverse datasets. We introduce SmartCal, an AutoML framework that automatically selects the optimal post-hoc calibration strategy from a pool of 12 methods. Using a large-scale knowledge base of 172 datasets spanning multiple modalities and 13 classifiers, we show that no single calibrator is universally superior. SmartCal employs a meta-model trained on meta-features of the calibration splits and classifier outputs to recommend the best calibration method for new tasks. Bayesian optimization further refines this selection, outperforming standard baselines and random search. Experiments demonstrate that SmartCal systematically improves calibration over existing approaches such as Beta Calibration and Temperature Scaling. The tool is freely available with a unified interface, simplifying the calibration process for researchers and practitioners.
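The abstract's core idea of selecting a post-hoc calibrator from a pool by its fit on a calibration split can be illustrated with a minimal sketch. This is not the paper's implementation (SmartCal uses a meta-model over dataset meta-features plus Bayesian optimization); the sketch below simply compares an uncalibrated sigmoid against temperature scaling by negative log-likelihood, with all function names being hypothetical illustrations:

```python
# Minimal sketch of calibrator selection by held-out fit (names hypothetical,
# not from the SmartCal paper): fit each candidate calibrator on a calibration
# split and keep the one with the lowest negative log-likelihood (NLL).
import numpy as np
from scipy.optimize import minimize_scalar

def nll(probs, y):
    """Binary negative log-likelihood of predicted probabilities."""
    p = np.clip(probs, 1e-12, 1 - 1e-12)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit_identity(logits, y):
    # Baseline: plain sigmoid, no calibration.
    return lambda z: 1.0 / (1.0 + np.exp(-z))

def fit_temperature(logits, y):
    # Temperature scaling: fit one scalar T minimizing NLL on the split.
    loss = lambda T: nll(1.0 / (1.0 + np.exp(-logits / T)), y)
    T = minimize_scalar(loss, bounds=(0.05, 10.0), method="bounded").x
    return lambda z: 1.0 / (1.0 + np.exp(-z / T))

# Simulate an overconfident classifier: logits inflated by a factor of 3.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=2000)
logits = 3.0 * (2 * y - 1 + rng.normal(0.0, 1.5, size=2000))

calibrators = {"uncalibrated": fit_identity, "temperature": fit_temperature}
# In practice the score would come from a held-out split, not the fitting data.
scores = {name: nll(fit(logits, y)(logits), y)
          for name, fit in calibrators.items()}
best = min(scores, key=scores.get)
```

Because the simulated logits are overconfident, temperature scaling recovers a temperature above 1 and wins the selection; SmartCal's contribution is predicting which of its 12 candidate methods will win without exhaustively fitting all of them.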
Submission Number: 20