An L1 Representer Theorem for Multiple-Kernel Regression

Published: 01 Jan 2018, Last Modified: 13 Nov 2024 · CoRR 2018 · CC BY-SA 4.0
Abstract: In this paper, we provide a Banach-space formulation of supervised learning with generalized total-variation (gTV) regularization. We identify the class of kernel functions that are admissible in this framework. Then, we propose a variation of supervised learning in a continuous-domain hybrid search space with gTV regularization. We show that the solution admits a multi-kernel expansion with adaptive positions. In this representation, the number of active kernels is upper-bounded by the number of data points, while the gTV regularization imposes an $\ell_1$ penalty on the kernel coefficients. Finally, we illustrate the outcome of our theory with numerical examples.
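To make the flavor of the result concrete, the following is a minimal discrete sketch of the key structural claim: an expansion in kernels centered at the data points, fit with an $\ell_1$ penalty on the coefficients, yields a sparse set of active kernels whose count is bounded by the number of data points. The Gaussian kernel, the ISTA solver, and all parameter values here are illustrative assumptions, not the paper's gTV framework (which works in a continuous-domain Banach space with adaptive kernel positions).

```python
import numpy as np

def gaussian_kernel(x, centers, width=0.5):
    # Illustrative kernel choice (an assumption, not from the paper):
    # K[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def ista_l1_kernel_fit(K, y, lam=0.05, n_iter=500):
    # ISTA (proximal gradient) for 0.5 * ||K c - y||^2 + lam * ||c||_1.
    step = 1.0 / np.linalg.norm(K, 2) ** 2  # 1 / Lipschitz constant of the gradient
    c = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = K.T @ (K @ c - y)
        c = c - step * grad
        # Soft-thresholding: the proximal operator of the l1 penalty
        c = np.sign(c) * np.maximum(np.abs(c) - step * lam, 0.0)
    return c

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3.0, 3.0, 40))
y = np.sinc(x) + 0.05 * rng.standard_normal(40)

K = gaussian_kernel(x, x)        # one candidate kernel per data point
c = ista_l1_kernel_fit(K, y)
active = int(np.sum(np.abs(c) > 1e-8))
# The l1 penalty drives many coefficients to exactly zero, so the number
# of active kernels is at most the number of data points.
```

In the continuous-domain setting of the paper, the kernel centers are themselves free (adaptive positions) rather than pinned to the samples as in this fixed-dictionary sketch.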