- TL;DR: Tailoring predictions from sequence models (such as LDSs and RNNs) via an explicit latent code.
- Abstract: Dynamical system models (including RNNs) often lack the ability to adapt sequence generation or prediction to a given context, limiting their real-world application. In this paper we show that hierarchical multi-task dynamical systems (MTDSs) provide direct user control over sequence generation via a latent code z that specifies the customization to an individual data sequence. This enables style transfer, interpolation, and morphing within generated sequences. We show that the MTDS can improve predictions via latent code interpolation, and avoid the long-term performance degradation of standard RNN approaches.
- Code: https://bitbucket.org/user3036976834/mtds_iclr_codebase/src/master/
- Keywords: Time-series modelling, Dynamical systems, RNNs, Multi-task learning
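To make the core idea concrete, the following is a minimal sketch (not the paper's implementation) of a multi-task dynamical system: a latent code z parameterizes the weights of a simple RNN, so each sequence gets its own dynamics, and interpolating between two codes morphs between the corresponding behaviors. All dimensions, names, and the linear z-to-weights map are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper).
H, X, Z = 8, 2, 3          # hidden size, input size, latent-code size

# Basis tensors for a hypernetwork-style parameter map: the latent
# code z linearly determines the RNN weights, so each task/sequence
# gets its own dynamics via its own z.
W_hh_basis = rng.normal(scale=0.1, size=(Z, H, H))
W_xh_basis = rng.normal(scale=0.1, size=(Z, X, H))

def rnn_weights(z):
    """Map a latent code z to sequence-specific RNN parameters."""
    W_hh = np.tensordot(z, W_hh_basis, axes=1)   # shape (H, H)
    W_xh = np.tensordot(z, W_xh_basis, axes=1)   # shape (X, H)
    return W_hh, W_xh

def run_rnn(z, xs):
    """Run a tanh RNN whose weights are fixed by the latent code z."""
    W_hh, W_xh = rnn_weights(z)
    h = np.zeros(H)
    states = []
    for x in xs:
        h = np.tanh(h @ W_hh + x @ W_xh)
        states.append(h)
    return np.stack(states)

xs = rng.normal(size=(5, X))                     # a short input sequence
z_a, z_b = rng.normal(size=Z), rng.normal(size=Z)

# Interpolating in latent space morphs between the two sequence
# "styles" induced by z_a and z_b.
z_mid = 0.5 * (z_a + z_b)
out_mid = run_rnn(z_mid, xs)
print(out_mid.shape)
```

In the actual MTDS, z would be inferred per sequence (rather than sampled) and the parameter map learned jointly with the base model; the sketch only shows the conditioning mechanism that makes style transfer and interpolation possible.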