Abstract: We study Sparse Multiple Kernel Learning (SMKL), the problem of selecting a sparse convex combination of prespecified kernels for support vector binary classification. Unlike prevailing $\ell_1$-regularized approaches that approximate a sparsifying penalty, we formulate the problem by imposing an explicit cardinality constraint on the kernel weights and adding an $\ell_2$ penalty for robustness. We solve the resulting non-convex minimax problem via an alternating best-response algorithm with two subproblems: the $\alpha$-subproblem is a standard kernel SVM dual solved via LIBSVM, while the $\beta$-subproblem admits an efficient solution via the Greedy Selector and Simplex Projector (GSSP) algorithm. We reformulate SMKL as a mixed-integer semidefinite optimization problem and derive a hierarchy of convex semidefinite relaxations, which can be used both to certify near-optimality of the solutions returned by our best-response algorithm and to warm-start it. On ten UCI benchmarks, our method with random initialization outperforms state-of-the-art MKL approaches in out-of-sample prediction accuracy by an average of $3.34$ percentage points (relative to the best-performing benchmark) while selecting a small number of candidate kernels in comparable runtime. With warm starting, our method outperforms the best-performing benchmark's out-of-sample prediction accuracy by an average of $4.05$ percentage points. Our convex relaxations provide a certificate that, in several cases, the solution returned by our best-response algorithm is globally optimal.
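To make the $\beta$-subproblem concrete, below is a minimal NumPy sketch of the Greedy Selector and Simplex Projector under its standard definition: keep the $k$ largest entries of a vector and project them onto the probability simplex, yielding the Euclidean projection onto the $k$-sparse simplex. This is an illustrative reimplementation, not the authors' code; the exact $\beta$-update in the paper may differ (e.g., to account for the $\ell_2$ penalty).

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {x : x >= 0, sum(x) = 1}, via the standard sort-and-threshold rule."""
    u = np.sort(v)[::-1]                       # entries in decreasing order
    css = np.cumsum(u)
    # largest index rho with u[rho] * (rho + 1) > css[rho] - 1
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def gssp(v, k):
    """Greedy Selector and Simplex Projector: Euclidean projection of v
    onto {x : x >= 0, sum(x) = 1, ||x||_0 <= k}. Greedily selects the k
    largest entries, projects them onto the simplex, zeros the rest."""
    v = np.asarray(v, dtype=float)
    idx = np.argsort(v)[::-1][:k]              # indices of the k largest entries
    beta = np.zeros_like(v)
    beta[idx] = project_simplex(v[idx])
    return beta
```

For example, `gssp([0.9, 0.2, -0.1, 0.5], k=2)` returns `array([0.7, 0., 0., 0.3])`: the two largest entries are retained and renormalized onto the simplex, and the remaining kernel weights are set exactly to zero.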
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: This version represents the camera-ready submission of the paper. All author names have now been added, and the manuscript has been fully deanonymized. The previously anonymous code link has been replaced with the official GitHub repository. Several minor typographical errors have been corrected.
Code: https://github.com/iglesiascaio/SparseMKL
Assigned Action Editor: ~Yunwen_Lei1
Submission Number: 5481