Optimizing Diffusion Noise Can Serve As Universal Motion Priors

Published: 01 Jan 2024, Last Modified: 20 Aug 2025 · CVPR 2024 · CC BY-SA 4.0
Abstract: We propose Diffusion Noise Optimization (DNO), a new method that effectively leverages existing motion diffusion models as motion priors for a wide range of motion-related tasks. Instead of training a task-specific diffusion model for each new task, DNO operates by optimizing the diffusion latent noise of an existing pre-trained text-to-motion model. Given the latent noise corresponding to a human motion, it propagates the gradient from the target criteria defined on the motion space through the whole denoising process to update the diffusion latent noise. As a result, DNO supports any use cases where criteria can be defined as a function of motion. In particular, we show that, for motion editing and control, DNO outperforms existing methods in both achieving the objective and preserving the motion content. DNO accommodates a diverse range of editing modes, including changing trajectory, pose, joint locations, or avoiding newly added obstacles. In addition, DNO is effective in motion denoising and completion, producing smooth and realistic motion from noisy and partial inputs. DNO achieves these results at inference time without the need for model retraining, offering great versatility for any defined reward or loss function on the motion representation.
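To illustrate the idea of optimizing the diffusion latent noise against a motion-space objective, here is a minimal sketch in PyTorch. The names `denoise` (a differentiable wrapper around a pre-trained text-to-motion diffusion sampler mapping latent noise to motion) and `criterion` (a task-specific loss on the generated motion) are hypothetical placeholders, not the paper's actual API.

```python
import torch

def dno_optimize(denoise, criterion, noise_shape, steps=300, lr=5e-2):
    """Sketch of diffusion noise optimization under the assumptions above."""
    # The diffusion latent noise is the only variable being optimized;
    # the pre-trained model's weights stay frozen.
    z = torch.randn(noise_shape, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        motion = denoise(z)       # run the full denoising chain, kept differentiable
        loss = criterion(motion)  # e.g. trajectory, joint-location, or obstacle terms
        opt.zero_grad()
        loss.backward()           # gradient flows back through the whole denoising process
        opt.step()
    return denoise(z).detach()
```

Because the objective is any differentiable function of the output motion, the same loop covers editing, control, denoising, and completion without retraining the underlying model.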