Learning Mahalanobis Metric Spaces via Geometric Approximation Algorithms

25 Sept 2019 (modified: 05 May 2023) ICLR 2020 Conference Blind Submission
TL;DR: Fully parallelizable and adversarial-noise resistant metric learning algorithm with theoretical guarantees.
Abstract: Learning Mahalanobis metric spaces is an important problem that has found numerous applications. Several algorithms have been designed for this problem, including Information Theoretic Metric Learning (ITML) [Davis et al. 2007] and Large Margin Nearest Neighbor (LMNN) classification [Weinberger and Saul 2009]. We consider a formulation of Mahalanobis metric learning as an optimization problem, where the objective is to minimize the number of violated similarity/dissimilarity constraints. We show that for any fixed ambient dimension, there exists a fully polynomial time approximation scheme (FPTAS) with nearly-linear running time. This result is obtained using tools from the theory of linear programming in low dimensions. We also discuss improvements of the algorithm in practice, and present experimental results on synthetic and real-world data sets. Our algorithm is fully parallelizable and performs favorably in the presence of adversarial noise.
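To make the optimization objective in the abstract concrete, here is a minimal sketch of the constraint-violation count being minimized. Given a positive semidefinite matrix A defining the Mahalanobis distance d_A(x, y) = sqrt((x - y)^T A (x - y)), similar pairs should fall within some upper threshold u and dissimilar pairs beyond some lower threshold l; the objective counts how many pairs violate this. The function names and thresholds below are illustrative assumptions, not the paper's actual interface.

```python
import numpy as np

def mahalanobis(x, y, A):
    """Mahalanobis distance between x and y under PSD matrix A."""
    d = x - y
    return float(np.sqrt(d @ A @ d))

def violated_constraints(A, similar, dissimilar, u=1.0, l=2.0):
    """Count violated similarity/dissimilarity constraints under metric A.

    Thresholds u (max distance for similar pairs) and l (min distance
    for dissimilar pairs) are illustrative choices, not from the paper.
    """
    violations = 0
    for x, y in similar:       # similar pairs should be within distance u
        if mahalanobis(x, y, A) > u:
            violations += 1
    for x, y in dissimilar:    # dissimilar pairs should be at least l apart
        if mahalanobis(x, y, A) < l:
            violations += 1
    return violations

# Toy instance under the identity metric: both constraints are satisfied.
A = np.eye(2)
similar = [(np.array([0.0, 0.0]), np.array([0.5, 0.0]))]     # distance 0.5
dissimilar = [(np.array([0.0, 0.0]), np.array([3.0, 0.0]))]  # distance 3.0
print(violated_constraints(A, similar, dissimilar))
```

The paper's contribution is an FPTAS that approximately minimizes this count over the choice of A in fixed ambient dimension; the sketch above only evaluates the objective for a given A.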
Code: https://drive.google.com/drive/folders/1XgABqyh8E1CoRGadh1KC5or7TBLgQkdl
Keywords: Metric Learning, Geometric Algorithms, Approximation Algorithms