From Motion to Behavior: Hierarchical Modeling of Humanoid Generative Behavior Control

03 Sept 2025 (modified: 12 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Human motion generation, Long-horizon synthesis, Task planning, Motion planning, Behavior control
TL;DR: We introduce Generative Behavior Control (GBC) and the GBC-100K dataset to generate long-horizon, goal-directed, and coherent humanoid behaviors via LLM planning and physics-informed control.
Abstract: Human motion generative modeling aims to synthesize the complex motions found in daily activities. However, current research is fragmented, focusing on either low-level, short-horizon motions or high-level, disembodied action planning, thereby neglecting the hierarchical and goal-oriented nature of human activities. This work shifts the research focus from motion generation to the more holistic task of humanoid behavior modeling. To formally address this, we first introduce Generative Behavior Control (GBC), a new task focused on generating long-term, physically plausible, and semantically coherent behaviors from high-level intentions. To tackle this task, we present a novel framework that aligns motion synthesis with hierarchical plans generated by large language models (LLMs), leveraging principles from task and motion planning. Concurrently, to overcome the limitations of existing benchmarks, we introduce the GBC-100K dataset, a large-scale corpus annotated with hierarchical semantic and motion plans. Experimental results demonstrate that our framework, trained on GBC-100K, generates more diverse and purposeful human behaviors with up to 10$\times$ longer horizons than existing methods. This work lays a foundation for future research in behavior-centric modeling, with all code and data to be made publicly available.
Supplementary Material: zip
Primary Area: datasets and benchmarks
Submission Number: 1568