Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches

Published: 06 Sept 2023, Last Modified: 17 Sept 2024. Accepted by TMLR.
Abstract: Kernel methods are learning algorithms that enjoy solid theoretical foundations but suffer from severe computational limitations. Sketching, which consists in searching for solutions within a subspace of reduced dimension, is a well-studied approach to alleviate this computational burden. However, statistically accurate sketches, such as the Gaussian one, usually contain few null entries, so that applying them to kernel methods and their non-sparse Gram matrices remains slow in practice. In this paper, we show that sparsified Gaussian (and Rademacher) sketches still produce theoretically valid approximations while allowing for substantial time and space savings thanks to an efficient \emph{decomposition trick}. To support our method, we derive excess risk bounds for both single- and multiple-output kernel problems with generic Lipschitz losses, thereby providing new guarantees for a wide range of applications, from robust regression to multiple quantile regression. Our theoretical results are complemented by experiments showing the empirical superiority of our approach over state-of-the-art sketching methods.
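To make the idea concrete, here is a minimal Python sketch (not the authors' implementation from the linked repository; the helper names `p_sparsified_gaussian_sketch`, `sketch_gram`, and `kernel_fn` are hypothetical). It assumes the standard scaling in which each entry of the s-by-n sketch is drawn as N(0, 1/(s*p)) with probability p and is zero otherwise, so that E[SᵀS] = I. The decomposition trick then means S K can be computed by evaluating only the rows of the Gram matrix K indexed by the non-empty columns of S:

```python
import numpy as np

def p_sparsified_gaussian_sketch(s, n, p, rng=None):
    """Draw an s x n p-sparsified Gaussian sketch: each entry is
    N(0, 1/(s*p)) with probability p, and 0 otherwise (hypothetical
    helper; scaling chosen so that E[S^T S] = I_n)."""
    rng = np.random.default_rng(rng)
    mask = rng.random((s, n)) < p
    gauss = rng.normal(scale=1.0 / np.sqrt(s * p), size=(s, n))
    return mask * gauss

def sketch_gram(S, kernel_fn, X):
    """Compute S @ K without forming the full n x n Gram matrix K:
    columns of S that are entirely zero never touch the corresponding
    rows of K, so only |support| kernel rows are evaluated."""
    support = np.flatnonzero(np.any(S != 0, axis=0))  # non-empty columns of S
    K_sub = kernel_fn(X[support], X)                  # |support| x n Gram block
    return S[:, support] @ K_sub

# Usage with a toy RBF kernel: in expectation only
# n * (1 - (1 - p)^s) rows of K are ever computed.
def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = np.random.default_rng(0).normal(size=(1000, 5))
S = p_sparsified_gaussian_sketch(s=50, n=1000, p=0.01, rng=0)
SK = sketch_gram(S, rbf, X)  # shape (50, 1000)
```

The savings come from the fact that kernel evaluation dominates the cost: under these assumptions, computing S K requires roughly n * |support| kernel evaluations instead of n², with |support| shrinking as p decreases.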
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/tamim-el/Fast-Kernel-Methods-for-Generic-Lipschitz-Losses-via-p-Sparsified-Sketches
Supplementary Material: zip
Assigned Action Editor: ~Makoto_Yamada3
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1266