Customizing Sequence Generation with Multi-Task Dynamical Systems

25 Sept 2019 (modified: 22 Oct 2023) · ICLR 2020 Conference Blind Submission
TL;DR: Tailoring predictions from sequence models (such as LDSs and RNNs) via an explicit latent code.
Abstract: Dynamical system models (including RNNs) often lack the ability to adapt sequence generation or prediction to a given context, limiting their real-world application. In this paper we show that hierarchical multi-task dynamical systems (MTDSs) provide direct user control over sequence generation via a latent code z that specifies the customization to the individual data sequence. This enables style transfer, interpolation, and morphing within generated sequences. We show that the MTDS can improve predictions via latent code interpolation, and avoid the long-term performance degradation of standard RNN approaches.
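To make the abstract's core idea concrete, the following is a minimal illustrative sketch (not the authors' code) of a multi-task linear dynamical system: a latent code z is mapped to the parameters of an LDS, so interpolating between two codes morphs the generated sequence. The hypernetwork weights `W_A`/`W_C` and the functions `lds_params`/`generate` are hypothetical names introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions for the toy example (assumed, not from the paper).
LATENT_DIM, STATE_DIM, OBS_DIM = 2, 3, 1

# A fixed linear "hypernetwork": maps z to flattened LDS parameters.
W_A = rng.normal(scale=0.5, size=(STATE_DIM * STATE_DIM, LATENT_DIM))
W_C = rng.normal(scale=0.5, size=(OBS_DIM * STATE_DIM, LATENT_DIM))

def lds_params(z):
    """Map a latent code z to LDS parameters (A, C)."""
    A = (W_A @ z).reshape(STATE_DIM, STATE_DIM)
    # Rescale A so its spectral radius stays below 1 (stable dynamics).
    A *= 0.9 / max(1.0, np.max(np.abs(np.linalg.eigvals(A))))
    C = (W_C @ z).reshape(OBS_DIM, STATE_DIM)
    return A, C

def generate(z, T=50):
    """Generate a length-T observation sequence under latent code z."""
    A, C = lds_params(z)
    x = np.ones(STATE_DIM)          # fixed initial state for the sketch
    ys = []
    for _ in range(T):
        x = A @ x                   # latent state transition
        ys.append(C @ x)            # emission
    return np.array(ys)

# Interpolating between two latent codes morphs the generated sequence,
# mirroring the style-transfer/interpolation behavior described above.
z0, z1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
y_mid = generate(0.5 * (z0 + z1))
```

In the paper's actual models the latent code modulates LDS or RNN parameters learned from data; here the hypernetwork is random and linear purely to keep the sketch self-contained.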
Code: https://bitbucket.org/user3036976834/mtds_iclr_codebase/src/master/
Keywords: Time-series modelling, Dynamical systems, RNNs, Multi-task learning
Community Implementations: 2 code implementations (https://www.catalyzex.com/paper/arxiv:1910.05026/code)
