STCN: Stochastic Temporal Convolutional Networks

Anonymous

Sep 27, 2018 (modified: Nov 17, 2018) · ICLR 2019 Conference Blind Submission
  • Abstract: Convolutional architectures have recently been shown to be competitive on many sequence modeling tasks when compared to the de facto standard of recurrent neural networks (RNNs), while providing computational and modeling advantages due to inherent parallelism. However, there currently remains a performance gap to more expressive stochastic RNN variants, especially those with several layers of dependent random variables. In this work, we propose stochastic temporal convolutional networks (STCNs), a novel architecture that combines the computational advantages of temporal convolutional networks (TCNs) with the representational power and robustness of stochastic latent spaces. In particular, we propose a hierarchy of stochastic latent variables that captures temporal dependencies at different time-scales. The architecture is modular and flexible due to the decoupling of deterministic and stochastic layers (a minimal sketch of the layered design appears after this list). We show that the proposed architecture achieves state-of-the-art log-likelihoods across several tasks. Finally, the model is capable of predicting high-quality synthetic samples over a long-range temporal horizon in a variety of tasks, including modeling of handwritten text and digits.
  • Keywords: latent variables, variational inference, temporal convolutional networks, sequence modeling, auto-regressive modeling
  • TL;DR: We combine the computational advantages of temporal convolutional architectures with the expressiveness of stochastic latent variables.
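The following is a minimal PyTorch sketch of the layered design described in the abstract, intended only as an illustration rather than the authors' implementation: stacked dilated causal convolutions form the deterministic TCN path, and each level feeds its own Gaussian latent layer, so deeper levels with larger receptive fields correspond to coarser time-scales. All names (`STCNSketch`, `CausalConv1d`, `LatentLayer`) and sizes are assumptions; the paper's top-down dependencies between latent layers and its variational training objective are omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """Dilated causal convolution: output at time t sees only inputs <= t."""
    def __init__(self, channels, kernel_size=2, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad so length is preserved
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                        # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))

class LatentLayer(nn.Module):
    """Gaussian latent variable conditioned on a deterministic TCN feature."""
    def __init__(self, channels, z_dim):
        super().__init__()
        self.to_mu = nn.Conv1d(channels, z_dim, 1)
        self.to_logvar = nn.Conv1d(channels, z_dim, 1)

    def forward(self, d):
        mu, logvar = self.to_mu(d), self.to_logvar(d)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return z, mu, logvar

class STCNSketch(nn.Module):
    """Stack of dilated causal conv layers; each level feeds its own latent.

    Deeper levels have exponentially larger receptive fields, i.e. they
    capture temporal dependencies at coarser time-scales. The deterministic
    (conv) and stochastic (latent) layers are decoupled, as in the abstract.
    """
    def __init__(self, channels=64, z_dim=16, num_levels=3):
        super().__init__()
        self.convs = nn.ModuleList(
            CausalConv1d(channels, dilation=2 ** i) for i in range(num_levels))
        self.latents = nn.ModuleList(
            LatentLayer(channels, z_dim) for _ in range(num_levels))

    def forward(self, x):                        # x: (batch, channels, time)
        zs, d = [], x
        for conv, latent in zip(self.convs, self.latents):
            d = torch.relu(conv(d))              # deterministic TCN path
            zs.append(latent(d))                 # stochastic layer at this scale
        return zs
```

Under these assumptions, `STCNSketch()(torch.randn(8, 64, 100))` returns one `(z, mu, logvar)` triple per level, each of shape `(8, 16, 100)`; because the convolutions are causal, every latent at time step t depends only on inputs up to t.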