Generative Simulation for Dexterous Hands

01 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Robotics; Embodied AI; Generative Simulation; Foundation Model
TL;DR: GenDexHand is a generative pipeline for dexterous hand tasks that combines LLM/VLM refinement, motion planning, DoF constraints, and subtask decomposition to produce high-quality trajectory data at scale.
Abstract: Data scarcity remains a fundamental bottleneck for embodied intelligence. Existing approaches use large language models (LLMs) to automate gripper‑based simulation generation, but they transfer poorly to dexterous manipulation, which demands more specialized environment design. Moreover, dexterous manipulation tasks are inherently harder because of their higher degrees of freedom, and generating feasible, trainable dexterous hand tasks at scale remains an open challenge. To this end, we present **GenDexHand**, a *generative simulation pipeline* that autonomously produces diverse robotic tasks and environments for dexterous manipulation. **GenDexHand** introduces a closed‑loop refinement process that adjusts object placements and scales based on vision‑language model (VLM) feedback, substantially improving the average quality of generated environments. Each task is further decomposed into sub‑tasks to enable sequential reinforcement learning, reducing training time and increasing success rates. By offering a simulation-based route to synthetic data generation, our work provides a viable path toward scalable training of diverse dexterous hand behaviors for embodied intelligence. Our anonymous website: https://sites.google.com/view/gendexhand.
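
The sketch below is a minimal, hypothetical illustration of the closed-loop refinement described in the abstract: a VLM scores a rendered scene and proposes placement/scale adjustments until the environment passes a quality threshold. The `SceneSpec` structure, `render_scene`, `vlm_feedback`, and all thresholds are assumptions for illustration, not GenDexHand's actual API.

```python
# Hypothetical sketch of VLM-driven closed-loop scene refinement.
# All names and data structures are illustrative placeholders.
from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class SceneSpec:
    """Object placements (x, y, z) and scale factors for one generated task scene."""
    placements: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    scales: Dict[str, float] = field(default_factory=dict)


def render_scene(spec: SceneSpec) -> bytes:
    """Stand-in for rendering the simulated scene to an image for the VLM."""
    return repr(spec).encode()


def vlm_feedback(image: bytes) -> dict:
    """Stand-in for a VLM call returning a feasibility score and suggested edits."""
    return {"score": 1.0, "placement_edits": {}, "scale_edits": {}}


def refine_scene(spec: SceneSpec, max_rounds: int = 5, threshold: float = 0.9) -> SceneSpec:
    """Iteratively apply VLM-suggested edits until the scene meets the quality bar."""
    for _ in range(max_rounds):
        feedback = vlm_feedback(render_scene(spec))
        if feedback["score"] >= threshold:
            break
        # Nudge object positions by the suggested deltas.
        for obj, delta in feedback["placement_edits"].items():
            x, y, z = spec.placements.get(obj, (0.0, 0.0, 0.0))
            spec.placements[obj] = (x + delta[0], y + delta[1], z + delta[2])
        # Rescale objects by the suggested multiplicative factors.
        for obj, factor in feedback["scale_edits"].items():
            spec.scales[obj] = spec.scales.get(obj, 1.0) * factor
    return spec


if __name__ == "__main__":
    scene = SceneSpec(placements={"mug": (0.3, 0.0, 0.02)}, scales={"mug": 1.0})
    print(refine_scene(scene))
```

In the paper's pipeline, each refined scene would then be split into sub-tasks and trained sequentially with reinforcement learning; the sketch only covers the refinement loop.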
Primary Area: applications to robotics, autonomy, planning
Submission Number: 574