Metric Learning in an RKHS

Published: 07 May 2025, Last Modified: 13 Jun 2025
UAI 2025 Poster · CC BY 4.0
Keywords: metric learning, distance learning, distance comparisons
TL;DR: This paper investigates metric learning in an RKHS based on a set of random triplet comparisons of the form *"Do you think item h is more similar to item i or item j?"*, which indicate similarities and differences between various items.
Abstract: This paper investigates metric learning in a Reproducing Kernel Hilbert Space (RKHS) based on a set of random triplet comparisons of the form *"Do you think item h is more similar to item i or item j?"*, which indicate similarities and differences between various items. The goal is to learn a metric in the RKHS that reflects the comparisons. Nonlinear metric learning using kernel methods and neural networks has shown great empirical promise. While previous works have addressed certain aspects of this problem, there is little or no theoretical understanding of such methods. The exception is the special (linear) case in which the RKHS is the standard $d$-dimensional Euclidean space, for which a comprehensive theory of metric learning exists. This paper develops a general RKHS framework for metric learning and provides novel generalization guarantees and sample complexity bounds. We validate our findings through a set of simulations and experiments on real datasets. Our code is publicly available at https://github.com/RamyaLab/metric-learning-RKHS.
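To make the triplet-comparison setup concrete, the following is a minimal, hypothetical sketch (not the paper's RKHS method; see the linked repository for that). It illustrates the linear special case the abstract mentions: learning a metric $d_L(x, y) = \|L(x - y)\|$ in Euclidean space from answers to "is item h closer to item i or to item j?", by gradient descent on a hinge loss. All names and parameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: learn a linear metric d_L(x, y) = ||L(x - y)||
# from triplet comparisons "item h is more similar to item i than to item j",
# by minimizing a hinge loss with plain gradient descent.

rng = np.random.default_rng(0)
n, d = 50, 5
X = rng.normal(size=(n, d))            # n random items in R^d

# A ground-truth metric used only to generate (noiseless) triplet answers.
L_true = rng.normal(size=(d, d))

def sqdist(L, a, b):
    """Squared metric distance ||L(a - b)||^2."""
    z = L @ (a - b)
    return z @ z

# Sample triplets (h, i, j), labeled +1 if h is closer to i than to j.
triplets = []
for _ in range(500):
    h, i, j = rng.choice(n, size=3, replace=False)
    y = 1.0 if sqdist(L_true, X[h], X[i]) < sqdist(L_true, X[h], X[j]) else -1.0
    triplets.append((h, i, j, y))

# Learn L by descending the hinge loss on the margin y * (d(h,j) - d(h,i)).
L = np.eye(d)
lr = 1e-3
for _ in range(200):
    grad = np.zeros_like(L)
    for h, i, j, y in triplets:
        m = y * (sqdist(L, X[h], X[j]) - sqdist(L, X[h], X[i]))
        if m < 1.0:                    # hinge is active: push margin above 1
            u, v = X[h] - X[j], X[h] - X[i]
            # d/dL ||L u||^2 = 2 (L u) u^T, so the loss gradient is:
            grad -= y * 2.0 * (np.outer(L @ u, u) - np.outer(L @ v, v))
    L -= lr * grad / len(triplets)

# Fraction of training triplets the learned metric agrees with.
acc = np.mean([
    (1.0 if sqdist(L, X[h], X[i]) < sqdist(L, X[h], X[j]) else -1.0) == y
    for h, i, j, y in triplets
])
print(f"triplet agreement: {acc:.2f}")
```

The kernelized setting studied in the paper replaces the raw features $x$ with feature-map evaluations in an RKHS; this sketch only shows the comparison-based supervision signal itself.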
Latex Source Code: zip
Code Link: https://github.com/RamyaLab/metric-learning-RKHS
Signed PMLR Licence Agreement: pdf
Submission Number: 277