Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

Published: 12 Jan 2021, Last Modified: 22 Oct 2023
ICLR 2021 Poster
Keywords: time series, sequential data, representation learning, low-rank tensors, classification, generative modelling
Abstract: Sequential data such as time series, video, or text can be challenging to analyse, as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object -- the free algebra -- to capture this non-commutativity. To address the innate computational complexity of this algebra, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification, mortality prediction, and generative modelling of video.
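The "low-rank tensor projection" idea in the abstract can be made concrete with a small sketch. The function below is illustrative only (names and recursion details are assumptions, not the authors' implementation): it computes the inner product of a sequence's degree-m feature tensor with a rank-1 tensor z_1 ⊗ … ⊗ z_m, which collapses an exponentially large tensor contraction into a linear-time pass with cumulative sums:

```python
import numpy as np

def low_rank_seq_feature(x, zs):
    """Inner product of the degree-m sequence feature with a rank-1 tensor
    z_1 (x) ... (x) z_m, i.e. the sum over non-contiguous subsequences
        sum_{i_1 < ... < i_m} <z_1, x_{i_1}> * ... * <z_m, x_{i_m}>,
    computed in O(T * m) time instead of materialising the full tensor.

    x:  (T, d) array, a length-T sequence of d-dimensional observations.
    zs: list of m (d,) arrays, the rank-1 components of the projection.
    """
    T = x.shape[0]
    S = np.ones(T + 1)   # S[t]: degree-0 partial sum over the prefix x[:t]
    for z in zs:         # each component raises the degree by one
        dots = x @ z     # <z, x_t> for every time step t
        new = np.zeros(T + 1)
        # recursion: S_k(t) = S_k(t-1) + <z_k, x_t> * S_{k-1}(t-1)
        new[1:] = np.cumsum(dots * S[:-1])
        S = new
    return S[T]
```

For example, with scalar observations x = (1, 2, 3) and z_1 = z_2 = (1,), the degree-2 feature is the sum over ordered pairs, 1·2 + 1·3 + 2·3 = 11. Stacking many such scalar features and composing them is, roughly, what gives the modular building blocks the abstract describes.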
One-sentence Summary: An Efficient Representation of Sequences by Low-Rank Tensor Projections
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip
Code: [![github](/images/github_icon.svg) tgcsaba/seq2tens](https://github.com/tgcsaba/seq2tens)
Data: [PhysioNet Challenge 2012](https://paperswithcode.com/dataset/physionet-challenge-2012), [SHAPES](https://paperswithcode.com/dataset/shapes-1), [Sprites](https://paperswithcode.com/dataset/sprites)
Community Implementations: [![CatalyzeX](/images/catalyzex_icon.svg) 1 code implementation](https://www.catalyzex.com/paper/arxiv:2006.07027/code)