Learning to reproduce stochastic time series using stochastic LSTM

Published: 01 Jan 2017, IJCNN 2017
Abstract: Recurrent neural networks (RNNs) have been widely used for complex data modeling. However, when it comes to modeling long-term, time-dependent sequential data with stochasticities, RNNs tend to fail because of the vanishing gradient problem. In this paper, we therefore propose a new architecture, the stochastic long short-term memory (S-LSTM), along with its forward and backward dynamics equations. S-LSTM models stochasticities following the Bayesian brain hypothesis, a probabilistic framework in which predictions are made and samples are tested against them to update beliefs about their causes. This is equivalent to minimizing the difference between the inference and posterior densities, thereby suppressing the free energy. During training, S-LSTM predicts both the mean and the variance at each time step. The predicted variance acts as an inverse weighting factor on the prediction error, so minimizing this weighted error amounts to maximizing the likelihood. Our proposed model is evaluated through numerical experiments on noisy Lissajous curves. In these experiments, S-LSTM is found to predict and preserve more of the stochasticity in the noisy Lissajous curves than a standard LSTM.
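The abstract describes the training objective only in words (a per-step mean and variance, with the variance acting as an inverse weight on the prediction error). The minimal, hypothetical PyTorch sketch below illustrates that idea as a Gaussian negative log-likelihood on next-step prediction; the class names, layer sizes, and toy Lissajous data are assumptions for illustration and not the authors' implementation or dynamics equations.

```python
# Hypothetical sketch of the abstract's idea: an LSTM head that predicts a mean and a
# (log-)variance at each time step, trained with a Gaussian negative log-likelihood in
# which the predicted variance inversely weights the squared prediction error.
import math
import torch
import torch.nn as nn


class StochasticLSTM(nn.Module):
    """Illustrative sequence model emitting per-step mean and log-variance."""

    def __init__(self, input_dim: int, hidden_dim: int, output_dim: int):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.mean_head = nn.Linear(hidden_dim, output_dim)
        self.logvar_head = nn.Linear(hidden_dim, output_dim)  # log-variance keeps variance positive

    def forward(self, x):
        h, _ = self.lstm(x)                       # (batch, time, hidden)
        return self.mean_head(h), self.logvar_head(h)


def gaussian_nll(mean, logvar, target):
    """Squared error weighted by the inverse predicted variance, plus a log-variance penalty."""
    inv_var = torch.exp(-logvar)
    return 0.5 * (inv_var * (target - mean) ** 2 + logvar).mean()


# Toy usage on a noisy 2-D Lissajous curve (illustrative data, not the paper's setup).
t = torch.linspace(0, 8 * math.pi, 400)
curve = torch.stack([torch.sin(3 * t), torch.sin(4 * t)], dim=-1)
noisy = curve + 0.05 * torch.randn_like(curve)
inputs, targets = noisy[None, :-1], noisy[None, 1:]  # predict the next step from the current one

model = StochasticLSTM(input_dim=2, hidden_dim=64, output_dim=2)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    mean, logvar = model(inputs)
    loss = gaussian_nll(mean, logvar, targets)
    optim.zero_grad()
    loss.backward()
    optim.step()
```

In this sketch, a large predicted variance down-weights the corresponding prediction error, which is one way to read the abstract's statement that the variance acts as an inverse weighting factor while training optimizes the likelihood.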