Low-rank and Smooth Tensor Recovery on Cartesian Product Graphs

Published: 21 May 2023, Last Modified: 12 Sept 2023
SampTA 2023 Paper
Readers: Everyone
Abstract: Data science research has found great success with algorithms that leverage the structure of the topological space on which high-dimensional data lies. In particular, low-rank tensor models, which represent low-dimensional latent factors in a succinct and parsimonious way, have become indispensable tools. These low-rank models have been utilized in a variety of applications, including tensor completion from corrupted or missing entries. In the standard tensor completion problem, the different modes of the tensor are assumed to be completely independent of each other. However, in many real-world problems, such as those involving spatio-temporal data, there exist relationships between the different modes. This information can be encoded in terms of graphs, which can bring additional structure to the tensor completion problem. In this paper, we introduce methods for structured tensor completion where both the low-rank structure and the smoothness of the tensor are incorporated into the optimization problem. In particular, we model tensor data as graph signals on Cartesian product graphs and use the Dirichlet energy to quantify the smoothness of tensor data with respect to the graph. We evaluate the performance of this tensor recovery approach for different types of data, i.e., low-rank, smooth, and low-rank plus smooth, and compare it with existing methods.
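To make the smoothness term concrete, the sketch below illustrates how the Dirichlet energy of a tensor viewed as a graph signal on a Cartesian product graph can be evaluated mode-wise, using the identity vec(X)^T (L_1 (+) ... (+) L_K) vec(X) = sum_k <X, X x_k L_k> for the Kronecker-sum Laplacian. This is an illustrative sketch only, not the paper's implementation; the path-graph Laplacians, function names, and tensor sizes are assumptions for demonstration.

```python
import numpy as np

def path_graph_laplacian(n):
    """Combinatorial Laplacian of a path graph on n nodes
    (illustrative choice; any mode-wise graph Laplacian works)."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

def mode_k_product(X, M, k):
    """Mode-k product X x_k M via unfold, multiply, refold."""
    Xk = np.moveaxis(X, k, 0).reshape(X.shape[k], -1)
    Yk = M @ Xk
    new_shape = (M.shape[0],) + tuple(np.delete(X.shape, k))
    return np.moveaxis(Yk.reshape(new_shape), 0, k)

def dirichlet_energy(X, laplacians):
    """Dirichlet energy of tensor X as a signal on the Cartesian
    product graph: sum_k <X, X x_k L_k>, equal to the quadratic form
    of vec(X) with the Kronecker-sum Laplacian."""
    return sum(np.tensordot(X, mode_k_product(X, L, k), axes=X.ndim)
               for k, L in enumerate(laplacians))

# Example: a 3-mode tensor on a product of three path graphs (assumed sizes).
shape = (10, 8, 6)
Ls = [path_graph_laplacian(n) for n in shape]
X = np.random.randn(*shape)
print(dirichlet_energy(X, Ls))
```

In a completion setting, a term of this form would typically be added to a low-rank objective as a smoothness regularizer over the observed and missing entries.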
Submission Type: Full Paper
Supplementary Materials: pdf