NeMF: Neural Motion Fields for Kinematic Animation

Published: 31 Oct 2022, Last Modified: 10 Oct 2022
NeurIPS 2022 Accept
Readers: Everyone
Keywords: Motion Modeling, Implicit Neural Representations, Neural Fields
Abstract: We present an implicit neural representation to learn the spatio-temporal space of kinematic motions. Unlike previous work that represents motion as discrete sequential samples, we propose to express the vast motion space as a continuous function over time, hence the name Neural Motion Fields (NeMF). Specifically, we use a neural network to learn this function for miscellaneous sets of motions, designed as a generative model conditioned on a temporal coordinate $t$ and a random vector $z$ that controls the style. The model is then trained as a Variational Autoencoder (VAE) with motion encoders to sample the latent space. We train our model on diverse human motion and quadruped datasets to demonstrate its versatility, and finally deploy it as a generic motion prior to solve task-agnostic problems, showing its superiority in various motion generation and editing applications such as motion interpolation, in-betweening, and re-navigating. More details can be found on our project page: https://cs.yale.edu/homes/che/projects/nemf/.
TL;DR: We present an implicit neural representation to learn the spatio-temporal space of kinematic motions.
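To make the formulation concrete, below is a minimal PyTorch sketch of such a field: a decoder $f(t, z)$ mapping a temporal coordinate and a latent style code to a pose, queryable at arbitrary temporal resolution. The module name, layer widths, and pose dimensionality are illustrative assumptions, not the paper's actual architecture; in the paper the model is trained as a VAE with motion encoders, which this sketch omits.

```python
import torch
import torch.nn as nn

class NeMFDecoder(nn.Module):
    """Continuous motion field f(t, z) -> pose.

    t is a temporal coordinate, z a latent code controlling style.
    Layer widths and the 72-D pose output are illustrative choices,
    not the paper's configuration.
    """

    def __init__(self, latent_dim: int = 128, pose_dim: int = 72, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, pose_dim),
        )

    def forward(self, t: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # t: (B, T, 1) times in [0, 1]; z: (B, latent_dim).
        z = z.unsqueeze(1).expand(-1, t.shape[1], -1)   # broadcast z over time
        return self.net(torch.cat([t, z], dim=-1))      # (B, T, pose_dim)

# Because the field is continuous in t, one latent code can be decoded
# at any frame rate: here 60 frames, but any T works with the same z.
decoder = NeMFDecoder()
z = torch.randn(1, 128)                       # e.g. a sample from the VAE prior
t = torch.linspace(0, 1, 60).view(1, 60, 1)   # 60 query timestamps
motion = decoder(t, z)                        # -> (1, 60, 72)
```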
Supplementary Material: zip