Improved Learning Rates of a Functional Lasso-type SVM with Sparse Multi-Kernel Representation

21 May 2021, 20:43 (modified: 21 Jan 2022, 17:19) · NeurIPS 2021 Spotlight
Keywords: Learning theory, Lasso-type SVM, Excess risk, Multi-kernel learning
Abstract: In this paper, we establish estimation bounds and excess risk upper bounds for support vector machines (SVMs) with a sparse multi-kernel representation. These convergence rates for multi-kernel SVMs are derived by analyzing a Lasso-type regularized learning scheme within composite multi-kernel spaces. We show that the oracle convergence rates of the resulting classifiers depend on the complexity of the multi-kernel class, the sparsity level, a Bernstein condition, and the sample size, which significantly improves on previous results even in the additive and linear cases. In summary, this paper not only provides unified theoretical results for multi-kernel SVMs, but also enriches the literature on high-dimensional nonparametric classification.
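To make the learning scheme in the abstract concrete, the following is a minimal illustrative sketch (not the authors' exact estimator) of a Lasso-type multi-kernel SVM: a hinge loss minimized over a sum of per-kernel coefficient blocks, with an L1 penalty on the coefficients to induce sparsity across the kernel dictionary. The RBF bandwidths, the subgradient-descent solver, and all hyperparameters here are assumptions chosen for illustration only.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gram matrix of the Gaussian (RBF) kernel between two point sets.
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def train_lasso_mksvm(X, y, gammas, lam=0.01, lr=0.05, epochs=300):
    """Illustrative sketch: subgradient descent on hinge loss with an
    L1 (Lasso-type) penalty on per-kernel coefficient blocks, so that
    uninformative kernels are driven toward zero coefficients."""
    n = len(X)
    Ks = [rbf_kernel(X, X, g) for g in gammas]   # one Gram matrix per kernel
    alphas = [np.zeros(n) for _ in Ks]           # coefficient block per kernel
    b = 0.0
    for _ in range(epochs):
        f = sum(K @ a for K, a in zip(Ks, alphas)) + b
        viol = (y * f) < 1                       # hinge-loss active set
        for m, K in enumerate(Ks):
            grad = -(K[:, viol] @ y[viol]) / n   # hinge subgradient (K symmetric)
            grad += lam * np.sign(alphas[m])     # Lasso-type L1 subgradient
            alphas[m] -= lr * grad
        b -= lr * (-y[viol].sum() / n)
    return Ks, alphas, b
```

A usage sketch on two well-separated Gaussian blobs: train, then check the sign of `f = sum(K @ a) + b` against the labels to verify the combined sparse-kernel decision function fits the data.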
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.