Keywords: calibration, local-calibration, representation-learning
TL;DR: We introduce a vector-quantized, region-aware extension of Dirichlet Calibration that achieves strong local calibration results while preserving global calibration and predictive performance through a parameter-efficient design.
Abstract: Accurate and well-calibrated Machine Learning (ML) models are essential in high-stakes settings, yet effective calibration remains challenging: global approaches assume calibration errors are homogeneous across the space, while local methods often rely on latent-space dimensionality reduction, which leads to information loss. We address these issues by injecting a notion of locality into Dirichlet Calibration via Vector Quantization (VQ). We further introduce an efficient parametrization of the Dirichlet concentrations that prevents a combinatorial explosion of calibration parameters.
Our approach allows us to learn heterogeneous calibration maps that generalize well even to sparse regions of the latent space. Experiments on benchmark datasets show significant improvements in local calibration while preserving global calibration and predictive performance.
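To make the idea concrete, here is a minimal numpy sketch of the kind of pipeline the abstract describes: standard Dirichlet calibration applies a linear map to log-probabilities followed by a softmax, and a VQ codebook assigns each example to a latent region with its own calibration parameters. This is not the authors' implementation; the parameter-efficient choice shown (one shared matrix plus a per-region bias) and all function names are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nearest_code(z, codebook):
    # Vector quantization: assign each latent vector to its nearest codebook entry.
    d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (n, num_regions)
    return d.argmin(axis=1)

def dirichlet_calibrate(p, W, b):
    # Global Dirichlet calibration: linear map on log-probabilities, then softmax.
    return softmax(np.log(np.clip(p, 1e-12, 1.0)) @ W.T + b)

def region_aware_calibrate(p, z, codebook, W_shared, region_biases):
    # Hypothetical parameter-efficient local variant: a single shared matrix
    # W_shared (K x K) plus one bias vector per codebook region, instead of a
    # full Dirichlet map per region (which would blow up the parameter count).
    regions = nearest_code(z, codebook)      # (n,) region index per example
    b = region_biases[regions]               # (n, K) region-specific biases
    return softmax(np.log(np.clip(p, 1e-12, 1.0)) @ W_shared.T + b)
```

With the identity matrix and zero biases this reduces to the uncalibrated predictions, which makes it easy to sanity-check; in practice the shared matrix and per-region biases would be fit by minimizing negative log-likelihood on a held-out calibration set.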
Submission Number: 16