UniFast-HGR: Scalable and Efficient Maximal Correlation for Multimodal Models

ICLR 2026 Conference Submission20073 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · License: CC BY 4.0
Keywords: HGR maximal correlation, Soft-HGR, multimodal feature, deep learning, neural network
TL;DR: This paper introduces UniFast-HGR, an optimized framework that makes HGR maximal correlation computation efficient for large-scale neural networks and multimodal learning tasks.
Abstract: This paper presents an optimized approach to computing Hirschfeld-Gebelein-Rényi (HGR) maximal correlation, addressing computational and efficiency challenges in large-scale neural networks and multimodal learning. The UniFast-HGR framework introduces three key innovations: replacing covariance with cosine similarity to eliminate matrix inversion, removing the diagonal of the correlation matrix to mitigate self-correlation bias, and simplifying the variance constraint via $\ell_2$-normalization. These contributions reduce computational complexity from $O(K^3)$ to $O(m^2K)$ while improving accuracy and stability, and the framework scales effectively across diverse multimodal applications. Additionally, the OptFast variant minimizes normalization steps, achieving efficiency comparable to a plain dot product without sacrificing precision. Experimental evaluations on benchmark datasets validate the framework's ability to balance computational efficiency with accuracy, establishing it as an effective solution to contemporary deep learning challenges.
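The three innovations listed in the abstract can be sketched as a single training loss. The code below is a hypothetical NumPy reconstruction, not the authors' implementation: it assumes a Soft-HGR-style objective (cross-modal alignment term minus a covariance-product penalty), with row-wise $\ell_2$-normalization standing in for the variance constraint, cosine-similarity Gram matrices replacing covariance (so no matrix inversion is needed), and zeroed diagonals to remove self-correlation bias. The function name `unifast_hgr_loss` and the exact weighting are assumptions.

```python
import numpy as np

def unifast_hgr_loss(F, G, eps=1e-8):
    """Hypothetical UniFast-HGR-style loss for features F, G of shape (n, K)."""
    # (1) l2-normalize each feature vector, replacing the variance constraint
    F = F / (np.linalg.norm(F, axis=1, keepdims=True) + eps)
    G = G / (np.linalg.norm(G, axis=1, keepdims=True) + eps)
    n = F.shape[0]
    # Cross-modal alignment term: mean cosine similarity of paired samples
    cross = np.sum(F * G) / n
    # (2) Cosine-similarity Gram matrices as covariance surrogates
    # (no matrix inversion, unlike whitening-based HGR estimators)
    Cf = F.T @ F / n
    Cg = G.T @ G / n
    # (3) Remove the diagonal to mitigate self-correlation bias
    np.fill_diagonal(Cf, 0.0)
    np.fill_diagonal(Cg, 0.0)
    penalty = 0.5 * np.sum(Cf * Cg)
    # Negate: HGR correlation is maximized, so the loss is minimized
    return -(cross - penalty)
```

Under this sketch, perfectly aligned modalities (G = F) should yield a lower loss than unrelated features, since the alignment term approaches 1 while the off-diagonal penalty stays small for roughly decorrelated dimensions.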
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 20073