ReChoreoNet: Repertoire-based Dance Re-choreography with Music-conditioned Temporal and Style Clues

Published: 01 Jan 2024, Last Modified: 15 Nov 2024. Machine Intelligence Research, 2024. License: CC BY-SA 4.0
Abstract: Generating dance that temporally and aesthetically matches the music is challenging in three respects. First, the generated motion should be beat-aligned with local musical features. Second, the global aesthetic style of the motion should match that of the music. Third, the generated motion should be diverse and non-self-repeating. To address these challenges, we propose ReChoreoNet, which re-choreographs high-quality dance motion for a given piece of music. A data-driven learning strategy is proposed to efficiently correlate the temporal connections between music and motion in a progressively learned cross-modality embedding space. The beat-aligned content motion is subsequently used as autoregressive context and control signal for a normalizing-flow model, which transfers the style of a prototype motion to the final generated dance. In addition, we present an aesthetically labelled music-dance repertoire (MDR) both for efficient learning of the cross-modality embedding and for understanding the aesthetic connections between music and motion. We demonstrate that our repertoire-based framework is robustly extensible in both content and style. Both quantitative and qualitative experiments have been carried out to validate the effectiveness of our proposed model.
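The abstract's core generative component is a normalizing flow whose transformation is conditioned on beat-aligned content motion. As a rough illustration of that idea (not the authors' implementation), the sketch below shows a single conditional affine coupling layer: the scale and shift applied to half of the motion vector are predicted from the other half concatenated with a context vector, which keeps the mapping exactly invertible. All class and helper names here are hypothetical.

```python
import numpy as np

# Hypothetical sketch: one affine coupling layer of a conditional
# normalizing flow. The "context" stands in for beat-aligned content
# motion features; the flow remains invertible because the parameters
# depend only on the untouched half of the input plus the context.

def mlp(x, w1, b1, w2, b2):
    """Tiny two-layer network producing the coupling parameters."""
    h = np.tanh(x @ w1 + b1)
    return h @ w2 + b2

class ConditionalAffineCoupling:
    def __init__(self, dim, ctx_dim, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.half = dim // 2
        in_dim = self.half + ctx_dim           # first half + context
        out_dim = 2 * (dim - self.half)        # log-scale and shift
        self.w1 = rng.normal(0.0, 0.1, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, out_dim))
        self.b2 = np.zeros(out_dim)

    def _params(self, x1, ctx):
        out = mlp(np.concatenate([x1, ctx]),
                  self.w1, self.b1, self.w2, self.b2)
        log_s, t = np.split(out, 2)
        return np.tanh(log_s), t               # bounded log-scale for stability

    def forward(self, x, ctx):
        x1, x2 = x[:self.half], x[self.half:]
        log_s, t = self._params(x1, ctx)
        y2 = x2 * np.exp(log_s) + t
        return np.concatenate([x1, y2]), log_s.sum()   # value, log|det J|

    def inverse(self, y, ctx):
        y1, y2 = y[:self.half], y[self.half:]
        log_s, t = self._params(y1, ctx)       # same params: y1 == x1
        x2 = (y2 - t) * np.exp(-log_s)
        return np.concatenate([y1, x2])
```

A full flow would stack several such layers with permutations in between; the exact invertibility checked here is what lets the model be trained by maximum likelihood while the conditioning signal steers the generated style.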
