DynTopo: Dynamic Topological Scene Graph for Robotic Autonomy in Human-Centric Environments

ICLR 2026 Conference Submission 16555 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Dynamic Topological Scene Graph, Scene Representation, Robotic Automation
Abstract: Autonomous operation of service robots in human-centric scenes remains challenging because it requires understanding changing environments and making context-aware decisions. Existing approaches such as topological maps offer efficient spatial priors but fail to model transient object relationships, whereas dense neural representations (e.g., NeRF) incur prohibitive computational costs to update. To address this, we propose the Dynamic Topological Scene Graph (DynTopo), which introduces dynamic components and relationships into persistent topological layouts for embodied robotic autonomy. Our framework constructs a global topological layout from posed RGB-D inputs, encoding room-scale connectivity and large static objects (e.g., furniture), while environmental and egocentric cameras populate dynamic information such as object position relations and human-object interaction patterns. A unified architecture integrates these dynamics into the global topology using semantic and spatial constraints, enabling seamless updates as the environment evolves. An agent powered by large language models (LLMs) interprets the unified graph, infers latent task triggers, and generates executable instructions grounded in robotic affordances. Extensive experiments demonstrate DynTopo's effectiveness as a scene representation. Real-world deployments on a mobile manipulator validate the system's practicality: the robot autonomously completes complex tasks as a cafeteria assistant in a dynamic scene, with no further training or reward engineering. See https://anonymous.4open.science/r/DynTopo-80C6 for video demonstrations and more details.
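To make the described structure concrete, the sketch below gives one plausible reading of a dynamic topological scene graph: persistent "place" nodes with room-scale connectivity, plus transient object/human nodes attached to them and refreshed from observations. It is not the authors' implementation; the class names, node kinds, and the networkx-based storage are assumptions for illustration only.

```python
# Hypothetical sketch of a DynTopo-style graph (assumed design, not the paper's code).
import time
from dataclasses import dataclass, field

import networkx as nx


@dataclass
class Node:
    node_id: str
    kind: str                      # "place", "static_object", "dynamic_object", or "human"
    position: tuple                # (x, y, z) in the global map frame
    attributes: dict = field(default_factory=dict)
    last_seen: float = field(default_factory=time.time)


class DynTopoGraph:
    """Persistent topological layout plus transient object/interaction edges."""

    def __init__(self):
        self.g = nx.Graph()

    def add_place(self, node: Node, connected_places=()):
        # Room-scale connectivity, e.g. extracted from a posed RGB-D reconstruction.
        self.g.add_node(node.node_id, data=node)
        for other in connected_places:
            self.g.add_edge(node.node_id, other, relation="connected_to")

    def update_dynamic(self, node: Node, anchor_place: str, relation: str):
        # Environmental / egocentric observations refresh dynamic nodes in place;
        # semantic and spatial constraints decide which place node they attach to.
        node.last_seen = time.time()
        self.g.add_node(node.node_id, data=node)
        self.g.add_edge(node.node_id, anchor_place, relation=relation)

    def prune_stale(self, max_age_s: float = 60.0):
        # Drop dynamic nodes that have not been re-observed recently.
        now = time.time()
        stale = [n for n, d in self.g.nodes(data="data")
                 if d is not None and d.kind != "place"
                 and now - d.last_seen > max_age_s]
        self.g.remove_nodes_from(stale)
```

Under this assumed design, an LLM-based agent would consume a serialization of the graph (nodes, relations, timestamps) as context when inferring task triggers and producing executable instructions.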
Supplementary Material: zip
Primary Area: applications to robotics, autonomy, planning
Submission Number: 16555