Bayesian Metric Learning for Uncertainty Quantification in Image Retrieval

Published: 21 Sept 2023, Last Modified: 14 Nov 2023, NeurIPS 2023 poster
Keywords: Laplace approximation, metric learning, uncertainty quantification, weight posterior, Bayesian
TL;DR: Laplace approximation for metric learning
Abstract: We propose a Bayesian encoder for metric learning. Rather than relying on neural amortization as in prior work, we learn a distribution over the network weights with the Laplace approximation. We first prove that the contrastive loss is a negative log-likelihood on the spherical space. We then propose three methods that ensure a positive definite covariance matrix. Lastly, we present a novel decomposition of the Generalized Gauss-Newton approximation. Empirically, we show that our Laplacian Metric Learner (LAM) yields well-calibrated uncertainties, reliably detects out-of-distribution examples, and achieves state-of-the-art predictive performance.
Supplementary Material: pdf
Submission Number: 9243
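
The abstract's core recipe, a Gaussian posterior over encoder weights obtained from a Laplace approximation with a positive definite curvature term, used to sample embeddings on the unit sphere, can be illustrated with a minimal sketch. This is not the authors' code: it uses a diagonal, last-layer, squared-gradient (empirical-Fisher style) curvature in place of the paper's Generalized Gauss-Newton decomposition, and all names (`Encoder`, `contrastive_nll`, `fit_diagonal_laplace`, `sample_embeddings`) as well as the assumed pair-loader format yielding `(x_a, x_b, same_class)` batches are illustrative assumptions.

```python
# Hedged sketch of a last-layer Laplace approximation for a metric-learning
# encoder. Diagonal curvature only; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Toy encoder; only the final linear layer is treated Bayesianly."""

    def __init__(self, in_dim=784, hid_dim=128, emb_dim=16):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.head = nn.Linear(hid_dim, emb_dim)

    def forward(self, x):
        # L2-normalise so embeddings live on the unit sphere.
        return F.normalize(self.head(self.backbone(x)), dim=-1)


def contrastive_nll(z_a, z_b, same_class, margin=0.5):
    """Contrastive loss over embedding pairs, read as a negative log-likelihood."""
    same_class = same_class.float()
    d = F.pairwise_distance(z_a, z_b)
    pos = same_class * d.pow(2)
    neg = (1.0 - same_class) * F.relu(margin - d).pow(2)
    return 0.5 * (pos + neg).sum()


def fit_diagonal_laplace(model, pair_loader, prior_precision=1.0):
    """Diagonal curvature of the last layer at the trained (MAP) weights;
    squared gradients serve as a cheap stand-in for the GGN used in the paper."""
    head_params = list(model.head.parameters())
    precision = [prior_precision * torch.ones_like(p) for p in head_params]
    for x_a, x_b, same_class in pair_loader:
        loss = contrastive_nll(model(x_a), model(x_b), same_class)
        grads = torch.autograd.grad(loss, head_params)
        # Accumulate squared gradients (positive semi-definite); the prior
        # precision term keeps the overall precision positive definite.
        precision = [h + g.pow(2) for h, g in zip(precision, grads)]
    return [1.0 / h for h in precision]  # posterior variances


def sample_embeddings(model, variances, x, n_samples=8):
    """Monte-Carlo embeddings drawn from the Laplace weight posterior."""
    head_params = list(model.head.parameters())
    means = [p.detach().clone() for p in head_params]
    samples = []
    with torch.no_grad():
        for _ in range(n_samples):
            for p, m, v in zip(head_params, means, variances):
                p.copy_(m + v.sqrt() * torch.randn_like(m))
            samples.append(model(x))
        for p, m in zip(head_params, means):  # restore MAP weights
            p.copy_(m)
    return torch.stack(samples)  # (n_samples, batch, emb_dim)
```

A hypothetical usage would be `variances = fit_diagonal_laplace(model, pair_loader)` followed by `sample_embeddings(model, variances, x_query).var(dim=0).sum(-1)` as a per-query uncertainty score; the spread of the sampled embeddings is the kind of signal that can serve for calibration and out-of-distribution detection, under these simplifying assumptions.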