Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

Sep 28, 2020 (edited Mar 27, 2021) · ICLR 2021 Poster
  • Keywords: time series, sequential data, representation learning, low-rank tensors, classification, generative modelling
  • Abstract: Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object -- the free algebra -- to capture this non-commutativity. To address the innate computational complexity of this algebra, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification, mortality prediction and generative models for video.
  • One-sentence Summary: An Efficient Representation of Sequences by Low-Rank Tensor Projections
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • Supplementary Material: zip
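The abstract's key idea is that a full degree-m tensor functional over a sequence's features is exponentially large in m, but a low-rank (here, rank-1) projection can be evaluated cheaply. As an illustrative sketch only, and not the paper's exact layer, the following hypothetical function evaluates the pairing of a rank-1 weight tensor theta_1 ⊗ … ⊗ theta_m with the degree-m non-commutative (ordered) features of a sequence, i.e. the sum over increasing index tuples i1 < … < im of the product of inner products <theta_j, x_{i_j}>, via dynamic programming in O(T·m·d) time instead of the O(d^m) cost of forming the full tensor:

```python
import numpy as np

def low_rank_seq_feature(x, thetas):
    """Pairing of the rank-1 tensor theta_1 x ... x theta_m with the
    degree-m ordered features of the sequence x (shape (T, d)):
        sum_{i1 < ... < im} prod_j <theta_j, x[i_j]>.
    Illustrative sketch; names and construction are assumptions,
    not the paper's exact architecture."""
    # inner products <theta_j, x_t> for every level j and time t: shape (m, T)
    ip = np.stack([x @ th for th in thetas])
    f = ip[0]  # degree-1 features: f[t] = <theta_1, x_t>
    for j in range(1, len(thetas)):
        # prefix sums of the lower-degree features over strictly earlier steps,
        # which enforces the ordering constraint i_{j-1} < i_j
        prefix = np.concatenate(([0.0], np.cumsum(f)[:-1]))
        f = ip[j] * prefix
    return f.sum()

# tiny worked example: T = 3 steps in d = 2 dimensions, degree m = 2
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
theta1 = np.array([1.0, 2.0])
theta2 = np.array([3.0, 1.0])
print(low_rank_seq_feature(x, [theta1, theta2]))  # -> 13.0
```

Because the features sum over *ordered* index tuples, reversing the sequence generally changes the value, which is exactly the non-commutativity the abstract highlights.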