A New Tensor Network: Tubal Tensor Train Network and its Applications

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Higher-order tensor decomposition models, Tensor Train decomposition, tubal product, T-SVD, Tubal Tensor Train, tensor completion, tensor compression
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: This paper introduces the Tubal Tensor Train (TTT) decomposition, a novel tensor decomposition model that effectively mitigates the curse of dimensionality inherent in the Tensor Singular Value Decomposition (T-SVD). The TTT decomposition represents an $N$th-order tensor as the tubal product (T-product) of a chain of two third-order and $(N-3)$ fourth-order core tensors. Like the Tensor Train (TT) decomposition, the proposed format thus circumvents the curse of dimensionality. To compute the TTT format of a given tensor, we propose two efficient algorithms. Numerical simulations on diverse tasks demonstrate the efficiency and accuracy of these algorithms compared with state-of-the-art methods.
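
As an illustration of how such a format could be assembled, the sketch below implements the standard T-product (FFT along the tube mode, frontal-slice matrix products, inverse FFT) and a hypothetical reconstruction of a fourth-order tensor from one chain of TTT cores. The core shapes, the contraction order, and the names `t_product` and `ttt_reconstruct_4d` are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

def t_product(A, B):
    # Standard T-product of third-order tensors A (n1 x n2 x m) and
    # B (n2 x n4 x m): FFT along the tube (last) mode, frontal-slice
    # matrix products in the Fourier domain, then inverse FFT.
    Ah = np.fft.fft(A, axis=-1)
    Bh = np.fft.fft(B, axis=-1)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)
    return np.fft.ifft(Ch, axis=-1).real

def ttt_reconstruct_4d(G1, G2, G3):
    # Hypothetical reconstruction of a 4th-order tensor from one TTT chain
    # with assumed core shapes G1 (n1 x r1 x m), G2 (r1 x n2 x r2 x m),
    # G3 (r2 x n3 x m): the TT ranks r1, r2 are contracted frontal-slice-wise
    # in the Fourier domain, mirroring the T-product above.
    G1h = np.fft.fft(G1, axis=-1)
    G2h = np.fft.fft(G2, axis=-1)
    G3h = np.fft.fft(G3, axis=-1)
    Xh = np.einsum('iak,ajbk,blk->ijlk', G1h, G2h, G3h)
    return np.fft.ifft(Xh, axis=-1).real

# Example: random cores with tube length m = 8 and assumed TT ranks (3, 4)
n1, n2, n3, m, r1, r2 = 5, 6, 7, 8, 3, 4
G1 = np.random.randn(n1, r1, m)
G2 = np.random.randn(r1, n2, r2, m)
G3 = np.random.randn(r2, n3, m)
X = ttt_reconstruct_4d(G1, G2, G3)  # shape (5, 6, 7, 8)
```

Under these assumed core shapes, storing the chain takes $(n_1 r_1 + r_1 n_2 r_2 + r_2 n_3)\,m$ entries rather than $n_1 n_2 n_3 m$, which is the sense in which a TT-style chain mitigates the curse of dimensionality.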
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2531