Learning Efficient Tensor Representations with Ring Structure Networks

12 Feb 2018, 19:56 (edited 15 Feb 2018) · ICLR 2018 Workshop Submission
  • Keywords: Tensor Decomposition, Tensor Networks, Stochastic Gradient Descent
  • Abstract: Tensor train (TT) decomposition is a powerful representation for high-order tensors that has been successfully applied to various machine learning tasks in recent years. In this paper, we propose a more general tensor decomposition with a ring-structured network, obtained by employing circular multilinear products over a sequence of lower-order core tensors; we term this the tensor ring (TR) representation. Several learning algorithms are presented, including blockwise ALS with adaptive tensor ranks and highly scalable SGD. Furthermore, we investigate the mathematical properties of TR representations, which enable basic algebraic operations to be performed in a computationally efficient way. Experimental results on synthetic signals and real-world datasets demonstrate the effectiveness of the TR model and the learning algorithms. In particular, we show that the structural information and high-order correlations within a 2D image can be captured efficiently by employing tensorization and the TR representation.
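The circular multilinear product the abstract refers to contracts a closed loop of third-order cores: each entry of the full tensor is the trace of a product of matrix slices, one slice per core. A minimal NumPy sketch of this reconstruction (the function name `tr_to_full` and the specific shapes/ranks are illustrative, not from the paper):

```python
import numpy as np

def tr_to_full(cores):
    """Reconstruct a full tensor from tensor-ring (TR) cores.

    Each core G_k has shape (r_k, n_k, r_{k+1}), and the ring closes
    with r_{d+1} = r_1.  An entry T[i1, ..., id] is the trace of the
    product of the matrix slices G_k[:, i_k, :].
    """
    shape = tuple(core.shape[1] for core in cores)
    full = np.empty(shape)
    for idx in np.ndindex(*shape):
        mat = np.eye(cores[0].shape[0])
        for k, i in enumerate(idx):
            mat = mat @ cores[k][:, i, :]  # chain the k-th slice
        full[idx] = np.trace(mat)          # close the ring
    return full

# Example: random TR cores with all ranks equal to 2 for a 3x4x5 tensor.
rng = np.random.default_rng(0)
ranks, dims = [2, 2, 2], [3, 4, 5]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 3]))
         for k in range(3)]
T = tr_to_full(cores)
print(T.shape)  # (3, 4, 5)
```

The storage cost of the cores grows linearly in the tensor order (sum of r_k * n_k * r_{k+1}), whereas the full tensor grows exponentially, which is what makes the TR format attractive for the high-order tensors obtained by tensorizing images.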