Neural CDEs for Long Time Series via the Log-ODE Method

28 Sept 2020 (modified: 22 Oct 2023)
ICLR 2021 Conference Blind Submission
Readers: Everyone
Keywords: CDE, neural differential equation, time series, long time series, log-ODE
Abstract: Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets. However, just like RNNs, Neural CDEs can be difficult to train on long time series. Here, we propose to apply a technique drawn from stochastic analysis, namely the log-ODE method. Instead of using the original input sequence, our procedure summarises the information over local time intervals via the log-signature map, and uses the resulting shorter stream of log-signatures as the new input. This represents a length/channel trade-off. In doing so, we demonstrate efficacy on problems of length up to 17k observations and observe significant training speed-ups, improvements in model performance, and reduced memory requirements compared to the existing algorithm.
One-sentence Summary: We process very long (17k observation) time series by using a Neural CDE with a numerical solver that (a) steps over multiple data points at once, (b) may be interpreted as a binning technique, and (c) represents a length/channel trade-off.
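As a rough illustration of the length/channel trade-off described in the abstract (a minimal sketch, not the authors' implementation), the snippet below assumes the signatory package and a hypothetical helper logsignature_windows: each non-overlapping window of the input path is replaced by its depth-d log-signature, producing a shorter but wider sequence that a Neural CDE solver can then step over.

```python
# Sketch of the length/channel trade-off: summarise a long path by a short
# stream of windowed log-signatures. Assumes the `signatory` package;
# `window_size` and the function name are illustrative, not from the paper's code.
import torch
import signatory


def logsignature_windows(path: torch.Tensor, depth: int, window_size: int) -> torch.Tensor:
    """path: (batch, length, channels) -> (batch, length // window_size, logsig_channels)."""
    batch, length, channels = path.shape
    num_windows = length // window_size
    # Drop any trailing observations that do not fill a whole window.
    path = path[:, : num_windows * window_size]
    # Group consecutive time steps into non-overlapping windows.
    windows = path.reshape(batch * num_windows, window_size, channels)
    # Log-signature of each window: a fixed-size summary of that local interval.
    logsigs = signatory.logsignature(windows, depth)
    return logsigs.reshape(batch, num_windows, -1)


if __name__ == "__main__":
    x = torch.randn(32, 17000, 4)  # a 17k-step series with 4 channels
    summary = logsignature_windows(x, depth=3, window_size=100)
    # Length shrinks from 17000 to 170; channel count grows to the
    # number of depth-3 log-signature terms over 4 channels.
    print(summary.shape)
```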
Supplementary Material: zip
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Community Implementations: 4 code implementations (https://www.catalyzex.com/paper/arxiv:2009.08295/code)
Reviewed Version (pdf): https://openreview.net/references/pdf?id=UeC5mrH-_A