Generalization bounds for Kernel Canonical Correlation Analysis

Published: 30 Mar 2023, Last Modified: 17 Sept 2024. Accepted by TMLR.
Abstract: We study the problem of multiview representation learning using kernel canonical correlation analysis (KCCA) and establish non-asymptotic bounds on generalization error for regularized empirical risk minimization. In particular, we give fine-grained high-probability bounds on generalization error ranging from $O(n^{-1/6})$ to $O(n^{-1/5})$ depending on underlying distributional properties, where $n$ is the number of data samples. For the special case of finite-dimensional Hilbert spaces (such as linear CCA), our rates improve, ranging from $O(n^{-1/2})$ to $O(n^{-1})$. Finally, our results generalize to the problem of functional canonical correlation analysis over abstract Hilbert spaces.
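Illustrative sketch (not part of the submission): the abstract refers to regularized empirical risk minimization for KCCA. Below is a minimal NumPy/SciPy sketch of one common regularized empirical KCCA estimator, written as a generalized symmetric eigenvalue problem over dual coefficients. The specific regularization scheme, the RBF kernel helper `rbf_gram`, and the parameter `lam` are illustrative assumptions and need not coincide with the estimator analyzed in the paper.

```python
import numpy as np
from scipy.linalg import eigh


def center_gram(K):
    """Center a Gram matrix in feature space: K <- H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


def rbf_gram(X, gamma=1.0):
    """Gaussian (RBF) Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2). Illustrative kernel choice."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))


def regularized_kcca(Kx, Ky, lam=0.1):
    """Top canonical correlation for one regularized empirical KCCA variant.

    Solves  max_{a,b} a^T Kx Ky b   subject to
            a^T (Kx^2 + n*lam*Kx) a = 1,   b^T (Ky^2 + n*lam*Ky) b = 1,
    as a generalized symmetric eigenvalue problem A v = rho B v with v = [a; b].
    """
    n = Kx.shape[0]
    Kx, Ky = center_gram(Kx), center_gram(Ky)
    A = np.block([[np.zeros((n, n)), Kx @ Ky],
                  [Ky @ Kx, np.zeros((n, n))]])
    B = np.block([[Kx @ Kx + n * lam * Kx, np.zeros((n, n))],
                  [np.zeros((n, n)), Ky @ Ky + n * lam * Ky]])
    B += 1e-8 * np.eye(2 * n)  # jitter: centered Gram matrices are rank-deficient
    evals, evecs = eigh(A, B)
    rho = evals[-1]                      # largest generalized eigenvalue = top regularized correlation
    a, b = evecs[:n, -1], evecs[n:, -1]  # dual coefficients for the two views
    return rho, a, b


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.normal(size=(200, 1))                             # shared latent signal
    X = np.hstack([z, rng.normal(size=(200, 2))])             # view 1
    Y = np.hstack([np.tanh(z), rng.normal(size=(200, 2))])    # view 2, nonlinear link
    rho, _, _ = regularized_kcca(rbf_gram(X, 0.5), rbf_gram(Y, 0.5), lam=0.1)
    print(f"estimated top canonical correlation: {rho:.3f}")
```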
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Minor polishing of the manuscript.
Assigned Action Editor: ~Sivan_Sabato1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 555