Automatic Character Motion Style Transfer via Autoencoder Generative Model and Spatio-Temporal Correlation Mining
Abstract: Motion style is essential for virtual character animation, and generating stylized motion efficiently is a significant problem in computer animation. In this paper, we present an efficient approach to automatic motion style transfer using an autoencoder generative model and spatio-temporal correlation mining, which allows users to transform an input motion into a new style while preserving its original content. To this end, we introduce a history vector of previous motion frames into the autoencoder generative network and extract the spatio-temporal features of the input motion. Accordingly, the spatio-temporal correlations within motions can be represented by the correlated hidden units of this network. Subsequently, we establish Gram-matrix constraints in this feature space to produce the transferred motion with the pre-trained generative model. As a result, various motions with particular semantics can be automatically transferred from one style to another, and extensive experiments demonstrate the outstanding performance of our approach.
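The Gram-matrix constraint described above can be illustrated with a minimal sketch. The abstract does not give the exact objective, so the function names (`gram_matrix`, `transfer_loss`), the weighting parameters, and the combined content-plus-style loss below are assumptions for illustration: hidden-unit activations over time form a feature map, their Gram matrix captures the spatio-temporal correlations that encode style, and a transferred motion is one whose features match the content motion directly while matching the style motion's Gram matrix.

```python
import numpy as np

def gram_matrix(F):
    """Gram matrix of hidden-unit activations F, shape (units, frames).

    Entry (i, j) is the temporal correlation between hidden units i and j,
    normalized by the number of frames.
    """
    return F @ F.T / F.shape[1]

def transfer_loss(F_out, F_content, F_style, w_content=1.0, w_style=1.0):
    """Hypothetical combined objective for style transfer in feature space.

    Content is preserved by matching features directly; style is imposed
    by matching Gram-matrix correlations. All weights are illustrative.
    """
    content_term = np.mean((F_out - F_content) ** 2)
    style_term = np.mean((gram_matrix(F_out) - gram_matrix(F_style)) ** 2)
    return w_content * content_term + w_style * style_term
```

In such a formulation, the transferred motion's features would be obtained by minimizing `transfer_loss` (e.g. by gradient descent) and then decoding the optimized features back to motion frames with the pre-trained generative model.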