FedCCA: Federated Canonical Correlation Analysis

Published: 03 Feb 2026, Last Modified: 03 Feb 2026. AISTATS 2026 Poster. CC BY 4.0
TL;DR: This work presents FedCCA, a privacy-preserving federated CCA framework that replaces costly matrix inversions with efficient truncated series, achieving accuracy on par with centralized CCA while converging faster and providing strong differential privacy guarantees.
Abstract: Canonical Correlation Analysis (CCA) is a key tool for cross-modal learning, but centralized solutions are impractical due to the heavy cost of high-dimensional covariance operations and the privacy sensitivity of distributed data. To address these challenges, we propose FedCCA, a federated framework that replaces explicit inverses and inner least-squares solves with a truncated von Neumann series, reducing matrix inversions to lightweight matrix–vector multiplications while retaining provable convergence. This series formulation not only improves efficiency but also provides explicit, tunable control of the truncation error, and its structure naturally splits into client-side matrix–vector multiplications and a server-side projection step, making it particularly well suited to federated deployment. Building on this foundation, we incorporate Gaussian differential privacy and derive practical upper and lower bounds on the required noise variance, which yield end-to-end $(\varepsilon,\delta)$ guarantees while preserving convergence stability. Empirical results on five datasets confirm that FedCCA achieves accuracy comparable to centralized CCA and consistently outperforms ALS/TALS baselines in both sub-optimality gap and convergence speed, all while maintaining rigorous privacy protection.
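To make the core computational idea concrete, here is a minimal, illustrative Python sketch (not the authors' implementation) of approximating $A^{-1}b$ with a truncated von Neumann series using only matrix–vector products, followed by Gaussian perturbation of a client update. The scaling constant `c`, truncation depth `K`, and noise scale `sigma` are assumed placeholders rather than values from the paper, and the actual client/server protocol and noise calibration are collapsed into a single process for brevity.

```python
# Sketch of the truncated von Neumann series: for SPD A with eigenvalues
# in (0, c], A^{-1} ≈ (1/c) * sum_{k=0}^{K} (I - A/c)^k, so A^{-1} b can be
# accumulated with K matrix-vector products and no explicit inversion.
import numpy as np

def neumann_inverse_apply(A, b, K=20):
    """Approximate A^{-1} @ b via a K-term truncated von Neumann series."""
    c = np.abs(A).sum(axis=1).max()   # cheap Gershgorin-style spectral bound
    t = b / c                          # k = 0 term, already scaled by 1/c
    x = t.copy()
    for _ in range(K):
        t = t - (A @ t) / c            # t <- (I - A/c) t : one mat-vec
        x += t                         # accumulate the next series term
    return x

def noisy_client_update(A_local, b_local, sigma=0.05, K=20, rng=None):
    """One hypothetical client step: solve the local system with the series,
    then perturb with Gaussian noise before sending to the server (the noise
    scale here is illustrative, not the paper's calibrated variance)."""
    rng = np.random.default_rng() if rng is None else rng
    u = neumann_inverse_apply(A_local, b_local, K=K)
    return u + sigma * rng.normal(size=u.shape)

# Quick sanity check against the exact solve on a well-conditioned SPD matrix.
rng = np.random.default_rng(0)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50 * np.eye(50)          # symmetric positive definite
b = rng.normal(size=50)
approx = neumann_inverse_apply(A, b, K=50)
exact = np.linalg.solve(A, b)
print("relative error:", np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

In the federated setting described in the abstract, each `A @ t` product would be computed on client-local covariance statistics and aggregated by the server before its projection step; this single-process sketch only illustrates why the series reduces inversion to matrix–vector work.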
Submission Number: 203