Optimized user-guided motion control of modular robots

Anastasia Bolotnikova, Kevin Holdcroft, Henry Cerbone, Christoph Belke, Auke Ijspeert, Jamie Paik

Published: 30 Sept 2025, Last Modified: 05 Nov 2025 · Nature Communications · CC BY-SA 4.0
Abstract: Transferring motion instructions from a user enables robots to perform new and unforeseen operations. Robot collectives, in particular, offer greater adaptability to changing tasks and environments. However, effectively transferring motion instructions becomes challenging as the collective's shape and size evolve. These changes often require additional system constraints to maintain robust motion control, which typically depends on pre-programmed knowledge of new tasks, ultimately limiting the collective's adaptability. To overcome these challenges, we propose a physical and computational platform for user-guided control of self-reconfigurable modular robots. The platform's first component is an optimization scheme for online processing of user commands, which prevents any modular robot action that would violate system or environment constraints. The second component is a set of Joint-space Joysticks that match the robot's morphology, enabling the user to control diverse and dynamically changing modular robot structures through direct physical interaction. We demonstrate the platform's efficacy and generalizability across a diverse set of modular robot morphologies using two independent robotic systems, Mori3 and Roombots, performing a range of tasks including pick-and-place, human assistance, legged locomotion, and workspace expansion. The combination of modular tangible interfaces and constrained optimization control enables users to operate shape-changing robots by matching their morphology and ensuring safe operation, allowing adaptation to different environments and application areas.
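The abstract describes an optimization scheme that filters user commands online so that no commanded motion violates system or environment constraints. The paper's exact formulation is not given here; the snippet below is a minimal sketch of one common way to realize such filtering, assuming a least-squares projection of a joystick joint-velocity command onto box constraints derived from joint-position and joint-velocity limits. The function name `filter_user_command` and all limit values are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize


def filter_user_command(q, dq_user, q_min, q_max, dq_max, dt):
    """Project a user-commanded joint velocity onto a feasible set.

    Minimizes ||dq - dq_user||^2 subject to velocity limits and to
    position limits over one control step, so the executed motion
    stays within the (hypothetical) system constraints.
    """
    # Velocity bounds that also keep the next position inside joint limits.
    lb = np.maximum(-dq_max, (q_min - q) / dt)
    ub = np.minimum(dq_max, (q_max - q) / dt)

    res = minimize(
        lambda dq: np.sum((dq - dq_user) ** 2),  # stay close to user intent
        x0=np.clip(dq_user, lb, ub),
        bounds=list(zip(lb, ub)),
        method="L-BFGS-B",
    )
    return res.x


# Example: a 3-joint module with a joystick command that exceeds limits.
q = np.array([0.0, 1.4, -0.2])                 # current joint positions (rad)
dq_user = np.array([2.0, 1.0, -3.0])           # raw joystick command (rad/s)
dq_safe = filter_user_command(
    q, dq_user,
    q_min=np.full(3, -1.5), q_max=np.full(3, 1.5),
    dq_max=np.full(3, 1.0), dt=0.05,
)
print(dq_safe)
```

With only box constraints this problem reduces to clamping the command, but the same quadratic-program structure extends naturally to additional linear constraints (e.g. self-collision avoidance or balance conditions) as the modular robot's morphology changes.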