Deep Gaussian Process State-Space Model for Motion Generation via Stochastic Expectation Propagation
Keywords: Deep GP-SSM, probabilistic model, dimension reduction, motion synthesis, Expectation Propagation
Abstract: Gaussian Processes (GPs) and related unsupervised learning techniques, such as Gaussian process dynamical models (GP-DMs), have been very successful in accurately modeling high-dimensional data from limited amounts of training data. However, these techniques typically suffer from high computational complexity. This makes it difficult to solve the associated learning problems for large data sets, since the related computations, in contrast to those of neural networks, are not node-local. Combining sparse approximation techniques for GPs with stochastic expectation propagation (SEP), we present a framework for the computationally efficient implementation of deep Gaussian process (state-space) models. We provide implementations of this approach on both the GPU and the CPU. We present the first implementation of such deep GP-SSMs and demonstrate the computational efficiency of our GPU implementation.
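The sparse approximations mentioned in the abstract reduce the cubic cost of exact GP inference by conditioning on a small set of inducing points. As a minimal illustrative sketch (not the authors' implementation; all names and the Subset-of-Regressors-style predictive mean are assumptions for illustration), a sparse GP predictive mean with m inducing points costs O(nm²) rather than O(n³):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between point sets a (n,d) and b (m,d).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_predict(X, y, Z, Xs, noise=0.1):
    # Subset-of-Regressors-style sparse GP predictive mean:
    # only an m x m system is solved for m inducing points Z,
    # giving O(n m^2) complexity instead of O(n^3).
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for numerical stability
    Kzx = rbf(Z, X)
    Ksz = rbf(Xs, Z)
    A = noise ** 2 * Kzz + Kzx @ Kzx.T       # m x m linear system
    return Ksz @ np.linalg.solve(A, Kzx @ y)

# Usage: fit a noisy sine with 200 data points but only 15 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 15)[:, None]          # inducing inputs
Xs = np.linspace(-3, 3, 50)[:, None]         # test inputs
mean = sparse_gp_predict(X, y, Z, Xs)
```

The paper's framework goes further by stacking such sparse GP layers into a deep state-space model and replacing exact inference with stochastic expectation propagation, but the inducing-point idea above is the building block that makes the computations tractable.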
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Probabilistic Methods (eg, variational inference, causal inference, Gaussian processes)
Supplementary Material: zip