Temporal Attention and Stacked LSTMs for Multivariate Time Series Prediction

Tryambak Gangopadhyay, Sin Yong Tan, Genyi Huang, Soumik Sarkar

Dec 01, 2018 NIPS 2018 Workshop Spatiotemporal Blind Submission
  • Abstract: The temporal attention mechanism has been applied to achieve state-of-the-art results in neural machine translation. LSTMs can capture long-term temporal dependencies in a multivariate time series. We apply a temporal attention mechanism on top of stacked LSTMs and demonstrate its performance on a multivariate time-series dataset for predicting pollution. By using attention to soft-search for the relevant parts of the input, our proposed model outperforms the encoder-decoder version (using only stacked LSTMs) in most cases. In our approach, the soft alignments highlight the time steps that are most relevant for predicting pollution at future time steps.
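The abstract describes attention as a soft search over the encoder's time steps, producing alignment weights that highlight the most relevant inputs. A minimal NumPy sketch of one such temporal-attention step is below; the bilinear scoring matrix `W` and all shapes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def temporal_attention(encoder_states, decoder_state, W):
    """One attention step over the top-layer stacked-LSTM outputs.

    encoder_states: (T, d) hidden states, one per input time step (assumed shape)
    decoder_state:  (d,)   current decoder hidden state
    W:              (d, d) bilinear alignment matrix (one common scoring choice)
    """
    scores = encoder_states @ W @ decoder_state   # (T,) alignment scores
    alpha = softmax(scores)                       # soft alignment over time steps
    context = alpha @ encoder_states              # (d,) attention-weighted context
    return context, alpha

rng = np.random.default_rng(0)
T, d = 8, 4
H = rng.normal(size=(T, d))   # stand-in for stacked-LSTM outputs
s = rng.normal(size=(d,))     # stand-in for decoder state
W = rng.normal(size=(d, d))
context, alpha = temporal_attention(H, s, W)
```

The vector `alpha` plays the role of the soft alignments mentioned in the abstract: its largest entries mark the input time steps that contribute most to the prediction.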