Flipped Classroom: Effective Teaching for Chaotic Time Series Forecasting

TMLR Paper 37 Authors

07 Apr 2022 (modified: 17 Sept 2024) · Rejected by TMLR · CC BY 4.0
Abstract: Gated RNNs like LSTM and GRU are the most common choice for forecasting time series data, reaching state-of-the-art performance. Training such sequence-to-sequence RNN models can be delicate, though. While gated RNNs effectively tackle exploding and vanishing gradients, there remains the exposure bias problem, provoked by training sequence-to-sequence models with teacher forcing. Exposure bias is a concern in natural language processing (NLP) as well, and many studies already propose solutions, the most prominent probably being scheduled sampling. For time series forecasting, though, the most frequent suggestion is to train the model in free running mode to stabilize its prediction capabilities over longer horizons. In this paper, we demonstrate that exposure bias is a serious problem even, or especially, outside of NLP, and that free running training is only sometimes successful. To fill the gap, we formalize curriculum learning (CL) strategies on both the training scale and the training iteration scale, propose several entirely new curricula, and systematically evaluate their performance in two experimental sets. We utilize six prominent chaotic dynamical systems for these experiments. We find that the newly proposed increasing training scale curricula combined with a probabilistic iteration scale curriculum consistently outperform previous training strategies, yielding an NRMSE improvement of up to 81% over free running or teacher forced training. For some datasets we additionally observe a reduced number of training iterations, and all models trained with the new curricula yield higher prediction stability, allowing for longer prediction horizons.
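The abstract contrasts teacher forcing (feeding ground truth back into the decoder) with free running (feeding back the model's own predictions) and proposes mixing the two via curricula. As an illustration only, and not the paper's exact method, the following minimal PyTorch sketch shows a probabilistic iteration-scale curriculum in the spirit of scheduled sampling: at every decoder step, the ground truth is fed back with probability `tf_prob`, and the model's own prediction otherwise. All names here (`GRUForecaster`, `train_step`, `tf_prob`) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GRUForecaster(nn.Module):
    """Minimal one-step GRU forecaster, applied autoregressively (illustrative)."""

    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.cell = nn.GRUCell(dim, hidden)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x, h):
        h = self.cell(x, h)          # advance the hidden state by one step
        return self.head(h), h       # predict the next value


def train_step(model, opt, context, target, tf_prob):
    """One training iteration mixing teacher forcing and free running.

    context: (batch, ctx_len, dim) observed window
    target:  (batch, horizon, dim) ground-truth future values
    tf_prob: per-step probability of feeding back the ground truth
             (the probabilistic iteration-scale curriculum)
    """
    batch, horizon, _ = target.shape
    h = torch.zeros(batch, model.cell.hidden_size)

    # Warm up the hidden state on the observed context window.
    for t in range(context.shape[1]):
        _, h = model(context[:, t], h)

    preds = []
    x = context[:, -1]
    for t in range(horizon):
        y, h = model(x, h)
        preds.append(y)
        # Per-step coin flip: ground truth (teacher forcing)
        # vs. the model's own prediction (free running).
        x = target[:, t] if torch.rand(()) < tf_prob else y

    loss = nn.functional.mse_loss(torch.stack(preds, dim=1), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

Holding `tf_prob` fixed gives a purely probabilistic iteration-scale curriculum; annealing it across epochs (e.g. `tf_prob = max(0.0, 1.0 - epoch / num_epochs)`) adds a training-scale curriculum on top, which is the kind of combination the abstract reports as most effective. The specific decay schedules evaluated in the paper are not reproduced here.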
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Jeffrey_Pennington1
Submission Number: 37