Dynamic Regrasping with Asynchronous Vision Feedback using a Minimalist Robotic System

Published: 22 May 2025 · RoboLetics 2.0 Workshop, ICRA 2025 (Oral) · CC BY 4.0
Keywords: Dynamic regrasping, dynamic manipulation, computed torque control
Abstract: Robotic manipulation has advanced significantly in quasi-static tasks such as pick-and-place, peg-in-hole, and object reorientation, yet dynamic manipulation, which exploits motion-driven forces such as inertia and momentum, remains a critical challenge. While humans effortlessly perform dynamic regrasping by tossing and catching objects mid-air, robots typically require complex hardware (e.g., dexterous hands, bimanual setups) or computationally intensive planning with extrinsic contacts. Prior dynamic regrasping methods rely on specialized hardware, such as high-speed vision systems or multi-fingered hands, limiting their practicality. This work introduces a minimalist framework for dynamic regrasping using a single robotic arm. Our approach decomposes the task into two phases: (1) a throwing phase, in which the object is propelled into a ballistic trajectory, and (2) a catching phase, in which computed torque control enables the gripper to dynamically regrasp the object mid-freefall. To throw the object accurately onto the expected trajectory, we refine the throwing policy iteratively using asynchronous vision feedback. By integrating motion planning, computed torque control, and asynchronous visual tracking, we achieve dynamic regrasping without high-speed vision or expensive robot hardware. We present preliminary experiments demonstrating the efficacy of the method, as well as its failure cases.
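For reference, the computed-torque (inverse-dynamics) controller named in the abstract conventionally takes the textbook form below; the notation and PD structure are standard assumptions, since the abstract does not state the paper's exact control law:

$$
\tau = M(q)\big(\ddot{q}_d + K_d(\dot{q}_d - \dot{q}) + K_p(q_d - q)\big) + C(q,\dot{q})\,\dot{q} + g(q)
$$

where $M(q)$ is the joint-space inertia matrix, $C(q,\dot{q})\dot{q}$ collects Coriolis and centrifugal terms, $g(q)$ is the gravity torque, $q_d(t)$ is the desired joint trajectory, and $K_p$, $K_d$ are positive-definite gains. Cancelling the nonlinear dynamics yields linear error dynamics, which is what lets the gripper track the fast catching motion accurately.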
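The iterative refinement of the throwing policy with asynchronous vision feedback can be sketched as follows. This is a minimal illustration under simplifying assumptions (point-mass ballistics, no air drag, one delayed position measurement per throw); the function names, the correction `gain`, and all numbers are hypothetical and not taken from the paper:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity, m/s^2

def predict_ballistic(p0, v0, t):
    """Position of a point mass released at p0 with velocity v0 after t
    seconds of free fall (air drag neglected)."""
    return p0 + v0 * t + 0.5 * GRAVITY * t ** 2

def refine_release_velocity(v_cmd, p_release, p_obs, t_obs, gain=0.5):
    """One refinement step: compare an asynchronously received vision
    measurement p_obs (taken t_obs seconds after release) against the
    trajectory predicted from the commanded release velocity."""
    error = p_obs - predict_ballistic(p_release, v_cmd, t_obs)
    # The error exposes the execution bias (v_actual - v_cmd ~= error / t_obs);
    # subtracting a fraction of it steers the next throw toward the target.
    return v_cmd - gain * error / t_obs

# Toy example: the camera sees the object ~2 cm below the predicted point,
# so the commanded release velocity is raised for the next throw.
v_cmd = np.array([0.0, 0.0, 2.0])       # initial guess, m/s
p_release = np.array([0.4, 0.0, 0.6])   # gripper pose at release, m
p_obs = np.array([0.4, 0.0, 0.73])      # delayed vision measurement, m
v_cmd = refine_release_velocity(v_cmd, p_release, p_obs, t_obs=0.1)
print(v_cmd)  # -> [0. 0. 2.105]
```

Because the measurement arrives asynchronously, the correction is applied between throws rather than inside the control loop, which is consistent with the abstract's claim that no high-speed vision is required.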
Submission Number: 10
