Non-negative Tensor Low-rank Decompositions Through the Lens of Information Geometry

AAAI 2025 Workshop CoLoRAI Submission 4 Authors

19 Nov 2024 (modified: 03 Feb 2025) · AAAI 2025 Workshop CoLoRAI Submission · CC BY 4.0
Keywords: EM algorithm, low-rank approximation, information geometry
TL;DR: EM-based unified framework for non-negative tensor decomposition inspired by information geometry.
Abstract: Non-negative tensor low-rank decompositions based on Kullback--Leibler (KL) divergence minimization are non-convex optimization problems, and the resulting instability has been a longstanding issue. This study introduces an information-geometric analysis of such decompositions to enhance their stability. The key idea behind our analysis is to treat the tensor ranks as hidden variables and employ the EM algorithm together with its information-geometric view. We reveal that the instability in tensor decomposition arises from hidden variables breaking the flatness of the model manifold, i.e., the set of low-rank tensors. Consequently, we reformulate tensor low-rank decomposition as iterative projections onto a flat model manifold of tensors without hidden variables, i.e., a set of rank-$1$ tensors, in a tensor space of higher order than that of the given tensor. This analysis bridges information geometry and tensor decomposition, resulting in a novel algorithm that ensures a monotonic decrease in the KL divergence regardless of the low-rank structure; we presently consider the CP, Tucker, and Tensor Train decompositions.
Submission Number: 4
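
For intuition, here is a minimal sketch of the classical EM-type multiplicative-update scheme for KL-based non-negative CP decomposition of a 3-way tensor, in which the index of the rank-$1$ component plays the role of the hidden variable and the generalized KL divergence decreases monotonically. This is only an illustration of the EM view the abstract builds on, not the paper's iterative-projection algorithm; the function names `em_ncp` and `kl_div` are hypothetical.

```python
import numpy as np

def kl_div(P, Q, eps=1e-12):
    """Generalized KL divergence D(P || Q) between non-negative tensors."""
    return np.sum(P * np.log((P + eps) / (Q + eps)) - P + Q)

def em_ncp(T, rank, n_iter=200, seed=0, eps=1e-12):
    """EM-style KL minimization for a non-negative rank-R CP decomposition
    T ≈ sum_r a[:, r] ∘ b[:, r] ∘ c[:, r] of a 3-way tensor T.

    Each update is the standard multiplicative (Lee–Seung-type) rule, which
    is equivalent to an E-step over the hidden rank index followed by an
    M-step, so the generalized KL divergence never increases.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    for _ in range(n_iter):
        # Update A: ratio of data to current reconstruction, back-projected
        # onto each rank-1 component, normalized by the component's mass.
        Q = np.einsum('ir,jr,kr->ijk', A, B, C) + eps
        A *= np.einsum('ijk,jr,kr->ir', T / Q, B, C) / (np.einsum('jr,kr->r', B, C) + eps)
        # Update B with the refreshed reconstruction.
        Q = np.einsum('ir,jr,kr->ijk', A, B, C) + eps
        B *= np.einsum('ijk,ir,kr->jr', T / Q, A, C) / (np.einsum('ir,kr->r', A, C) + eps)
        # Update C likewise.
        Q = np.einsum('ir,jr,kr->ijk', A, B, C) + eps
        C *= np.einsum('ijk,ir,jr->kr', T / Q, A, B) / (np.einsum('ir,jr->r', A, B) + eps)
    return A, B, C

# Usage: recover a synthetic rank-3 non-negative tensor and check the fit.
rng = np.random.default_rng(1)
T = np.einsum('ir,jr,kr->ijk', rng.random((4, 3)), rng.random((5, 3)), rng.random((6, 3)))
A, B, C = em_ncp(T, rank=3)
print(kl_div(T, np.einsum('ir,jr,kr->ijk', A, B, C)))  # small, and monotonically decreasing over iterations
```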