UniFast-HGR: Scalable and Efficient Maximal Correlation for Multimodal Models

Submitted: 19 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · License: CC BY 4.0
Keywords: HGR maximal correlation, Soft-HGR, multimodal feature, deep learning, neural network
TL;DR: This paper introduces UniFast-HGR, an optimized framework that makes HGR maximal correlation computation efficient for large-scale neural networks and multimodal learning tasks.
Abstract: This paper introduces UniFast-HGR, an efficient and scalable framework for estimating Hirschfeld-Gebelein-Rényi (HGR) maximal correlation in multimodal learning. The method addresses computational bottlenecks in traditional HGR and Soft-HGR approaches, which suffer from $O(K^3)$ complexity due to covariance matrix inversion and limited scalability to deep architectures. UniFast-HGR incorporates three key innovations: replacing covariance with cosine similarity to avoid matrix inversion, removing diagonal elements to mitigate self-correlation bias, and applying $\ell_2$ normalization as a variance constraint for improved stability. These improvements reduce computational complexity to $O(m^2K)$ while maintaining bounded correlation scores. The OptFast-HGR variant further accelerates computation by simplifying normalization steps, achieving dot-product-level efficiency with minimal accuracy loss. Experimental evaluations across benchmark datasets validate the framework's ability to balance computational efficiency with accuracy, establishing it as a practical objective for scaling maximal-correlation learning in contemporary deep networks.
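The three ingredients named in the abstract (cosine similarity in place of covariance, diagonal removal, and $\ell_2$ normalization as a variance constraint) can be sketched as a single NumPy scoring function. This is an illustrative reconstruction from the abstract only, not the paper's exact loss: the function name `unifast_hgr_score`, the use of off-diagonal within-modality similarities as the self-correlation penalty, and the averaging constants are all assumptions.

```python
import numpy as np

def unifast_hgr_score(f, g, eps=1e-8):
    """Illustrative UniFast-HGR-style score (sketch based on the abstract).

    f, g: (n, K) feature matrices from two modalities, one row per sample.
    """
    # Center features, as in HGR-style objectives (zero-mean condition).
    f = f - f.mean(axis=0)
    g = g - g.mean(axis=0)
    # l2-normalize each feature dimension: the variance constraint.
    # Inner products now become cosine similarities, so no K x K
    # covariance matrix has to be inverted.
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + eps)
    g = g / (np.linalg.norm(g, axis=0, keepdims=True) + eps)
    # Cross-modal cosine similarity between matched feature dimensions;
    # each entry is bounded in [-1, 1] by the normalization above.
    cross = np.sum(f * g, axis=0)              # shape (K,)
    # Within-modality similarity matrices with the diagonal removed:
    # the diagonal is identically 1 (self-correlation) and carries no
    # information, so only off-diagonal redundancy is penalized.
    sf, sg = f.T @ f, g.T @ g
    k = f.shape[1]
    off = (np.abs(sf).sum() - np.trace(np.abs(sf))
           + np.abs(sg).sum() - np.trace(np.abs(sg))) / (2 * k * (k - 1))
    return cross.mean() - off
```

With matched features the cross term approaches 1 and the score stays bounded above by 1, while unrelated modalities score near zero; a differentiable variant of the same computation could serve as a training objective.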
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 20073