Flexible Nonparametric Kernel Learning with Different Loss Functions

Published: 01 Jan 2013, Last Modified: 23 Sept 2023. ICONIP (2) 2013.
Abstract: Side information is highly useful in the learning of a nonparametric kernel matrix. However, this often leads to an expensive semidefinite program (SDP). In recent years, a number of dedicated solvers have been proposed. Though much better than off-the-shelf SDP solvers, they still cannot scale to large data sets. In this paper, we propose a novel solver based on the alternating direction method of multipliers (ADMM). The key idea is to use a low-rank decomposition of the kernel matrix Z = X^⊤Y, with the constraint that X = Y. The resultant optimization problem, though non-convex, has favorable convergence properties and can be efficiently solved without requiring eigen-decomposition in each iteration. Experimental results on a number of real-world data sets demonstrate that the proposed method is as accurate as directly solving the SDP, but can be one to two orders of magnitude faster.
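The key idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: it substitutes a simple Frobenius-norm fit to a target kernel K0 for the paper's side-information loss, and the function name, penalty parameter rho, and iteration count are all assumptions. It shows the factored variable Z = X^⊤Y, the ADMM consensus constraint X = Y enforced through a dual variable, and that each iteration only solves small linear systems, with no eigen-decomposition.

```python
import numpy as np

def admm_lowrank_kernel(K0, r, rho=1.0, n_iter=2000, seed=0):
    """Illustrative ADMM for learning Z = X.T @ Y with consensus X = Y,
    so that Z is (approximately) symmetric PSD at convergence.

    Stand-in loss: 0.5 * ||X.T @ Y - K0||_F^2  (the paper uses
    side-information losses instead; this is an assumption for the demo).
    Each iteration solves two r-by-r linear systems -- no
    eigen-decomposition is required.
    """
    n = K0.shape[0]
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((r, n))
    Y = X.copy()
    Lam = np.zeros((r, n))  # dual variable for the constraint X = Y
    I = rho * np.eye(r)
    for _ in range(n_iter):
        # X-step: argmin_X 0.5||X^T Y - K0||^2 + rho/2 ||X - Y + Lam/rho||^2
        X = np.linalg.solve(Y @ Y.T + I, Y @ K0 + rho * Y - Lam)
        # Y-step: the same subproblem with the roles of X and Y swapped
        Y = np.linalg.solve(X @ X.T + I, X @ K0 + rho * X + Lam)
        # dual ascent on the consensus constraint X = Y
        Lam += rho * (X - Y)
    return X, Y, X.T @ Y
```

At a fixed point X = Y, so Z = X^⊤X is symmetric positive semidefinite of rank at most r, which is exactly how the factorization sidesteps the explicit PSD constraint of the SDP.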