Variational Hyper RNN for Sequence Modeling

Anonymous

Sep 25, 2019 · ICLR 2020 Conference Blind Submission
  • Keywords: variational autoencoder, hypernetwork, recurrent neural network, time series
  • TL;DR: We propose a novel probabilistic sequence model that excels at capturing high variability in time series data using hypernetworks.
  • Abstract: In this work, we propose a novel probabilistic sequence model that excels at capturing high variability in time series data, both across sequences and within an individual sequence. Our method uses temporal latent variables to capture information about the underlying data pattern and dynamically decodes the latent information into modifications of the weights of the base decoder and recurrent model. The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data that exhibit large-scale variations, regime shifts, and complex dynamics.
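
The central mechanism described in the abstract, temporal latent variables that are decoded into modifications of the recurrent model's weights, follows the hypernetwork recipe. The sketch below is an assumed illustration of that mechanism and not the authors' code: a latent sample z_t is mapped by a small hypernetwork to per-unit scale factors that rescale the base recurrent weights at each step. The class, parameter, and dimension names are hypothetical.

```python
# Minimal sketch (assumption, not the paper's implementation): a recurrent cell
# whose base weights are rescaled at every step by a hypernetwork conditioned
# on a temporal latent variable z_t.
import torch
import torch.nn as nn

class HyperModulatedRNNCell(nn.Module):
    def __init__(self, input_dim, hidden_dim, latent_dim):
        super().__init__()
        # Base recurrent weights shared across time steps.
        self.W_x = nn.Parameter(0.1 * torch.randn(hidden_dim, input_dim))
        self.W_h = nn.Parameter(0.1 * torch.randn(hidden_dim, hidden_dim))
        self.b = nn.Parameter(torch.zeros(hidden_dim))
        # Hypernetwork: maps z_t to per-unit scale factors that modify
        # the rows of the base weight matrices.
        self.hyper = nn.Linear(latent_dim, 2 * hidden_dim)

    def forward(self, x_t, h_prev, z_t):
        scale_x, scale_h = self.hyper(z_t).chunk(2, dim=-1)
        # scale * (x @ W.T) equals row-rescaling W by the latent code, i.e.
        # the latent dynamically modifies the recurrent weights.
        pre = scale_x * (x_t @ self.W_x.t()) \
            + scale_h * (h_prev @ self.W_h.t()) + self.b
        return torch.tanh(pre)

# Usage: in a full model, z_t would be sampled from an amortized posterior
# (training) or the prior (generation); here it is a placeholder.
cell = HyperModulatedRNNCell(input_dim=1, hidden_dim=32, latent_dim=8)
x_t = torch.randn(4, 1)       # batch of 4 scalar observations
h = torch.zeros(4, 32)        # initial hidden state
z_t = torch.randn(4, 8)       # temporal latent sample (placeholder)
h = cell(x_t, h, z_t)
```

The per-unit scaling form is one common way hypernetworks modify weights without generating full matrices; the paper's exact parameterization of the decoder and recurrent updates may differ.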