Abstract: In this paper we address the problem of modelling time series with irregular intervals by incorporating a continuous-time version of the Kalman filter into a neural network architecture. Building on the idea of Recurrent Kalman Networks (RKNs), we use an encoder-decoder structure to learn a latent observation space and a latent state space in which the dynamics of the data can be approximated linearly. A recurrent Kalman component alternates between continuous-time latent state propagation and Bayesian updates from incoming observations. This allows the model to react instantaneously to observations arriving at arbitrary times, while retaining sufficient expressive power to capture nonlinear dynamics. Experiments on synthetic data show that the model is indeed able to capture continuous, nonlinear dynamics.
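The alternation the abstract describes, propagating a latent Gaussian state over an arbitrary interval and then correcting it with a Bayesian update, can be illustrated with a minimal continuous-discrete Kalman step. This is only a sketch of the generic filter recursion, not the paper's architecture: the latent dynamics matrix `A`, observation map `H`, and the per-unit-time process noise `Q` are hypothetical placeholders standing in for quantities the actual model learns.

```python
import numpy as np
from scipy.linalg import expm

def predict(m, P, A, Q, dt):
    """Propagate the latent Gaussian belief (m, P) forward by an arbitrary
    interval dt under linear continuous-time dynamics dx/dt = A x.
    Q is a (hypothetical) per-unit-time process-noise covariance."""
    F = expm(A * dt)          # closed-form transition over the interval
    m = F @ m
    P = F @ P @ F.T + Q * dt  # noise accumulates in proportion to dt
    return m, P

def update(m, P, y, H, R):
    """Standard Kalman (Bayesian) update from a latent observation y
    with observation map H and observation noise covariance R."""
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    m = m + K @ (y - H @ m)
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return m, P

# Irregular intervals pose no problem: predict(..., dt) accepts any dt,
# so the filter can step exactly to each observation's timestamp.
m, P = np.array([1.0, 0.0]), np.eye(2)
A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # assumed oscillator dynamics
Q = 0.01 * np.eye(2)
m, P = predict(m, P, A, Q, dt=0.5)
m, P = update(m, P, np.array([1.2]),
              H=np.array([[1.0, 0.0]]), R=np.array([[0.1]]))
```

In the RKN setting, the encoder would supply the latent observation `y` (and possibly `R`), and the decoder would map the filtered state `m` back to the data space; the sketch above shows only the recurrent Kalman core.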