Abstract: The autoregressive moving average (ARMA) model is a classical approach to
modeling time series data, and arguably one of the most studied.
It has compelling theoretical properties and is widely used among practitioners.
More recent deep learning approaches have popularized recurrent neural networks (RNNs)
and, in particular, long short-term memory (LSTM) cells, which have become one of
the best-performing and most common building blocks in neural time series modeling.
While advantageous for time series data or sequences with long-term effects,
complex RNN cells are not always necessary and can sometimes even be inferior to
simpler recurrent approaches. In this work, we introduce the ARMA cell,
a simpler, modular and effective approach for time series modeling in neural
networks. This cell can be used in any neural network architecture where
recurrent structures are present and naturally handles multivariate time series
using vector autoregression. We also introduce the ConvARMA cell as a natural
successor for spatially correlated time series. Our experiments show that the
proposed methodology is competitive with popular alternatives in terms of
performance while being more robust and compelling due to its simplicity.
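To make the kind of recurrence described above concrete, the following is a minimal sketch of an ARMA(p, q)-style recurrent update in NumPy. It is an illustration under assumptions, not the authors' implementation: the function name `arma_cell_step`, the weight names `alpha` and `beta`, and the choice of a tanh activation are made up here for clarity, and the weights are random rather than learned.

```python
import numpy as np

def arma_cell_step(x_lags, y_lags, alpha, beta, bias, activation=np.tanh):
    """One time step: linearly combine the last p inputs and the last q unit
    outputs, then apply an activation (a lagged-input / lagged-output recurrence)."""
    return activation(bias + np.dot(alpha, x_lags) + np.dot(beta, y_lags))

# Toy usage on a univariate series.
rng = np.random.default_rng(0)
series = rng.normal(size=100)
p, q = 3, 2
alpha = rng.normal(scale=0.1, size=p)   # weights on lagged inputs (AR-like part)
beta = rng.normal(scale=0.1, size=q)    # weights on lagged cell outputs (MA-like part)
bias = 0.0

outputs = [0.0] * q                     # zero-initialize the output history
for t in range(p, len(series)):
    x_lags = series[t - p:t][::-1]            # most recent input lag first
    y_lags = np.array(outputs[-q:][::-1])     # most recent output lag first
    outputs.append(float(arma_cell_step(x_lags, y_lags, alpha, beta, bias)))
```

In a trained cell the weights would be fitted by backpropagation, and a multivariate version would replace the scalar weights with matrices in the spirit of vector autoregression; this sketch only shows the shape of the recurrence.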
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Sinead_Williamson1
Submission Number: 395