CCIL: Continuity-Based Data Augmentation for Corrective Imitation Learning

Published: 05 Nov 2023, Last Modified: 30 Oct 2023, OOD Workshop @ CoRL 2023
Keywords: imitation learning, model-based, data augmentation, robustness
TL;DR: We generate corrective labels for imitation learning by leveraging the presence of local continuity in the dynamics function.
Abstract: We present a new technique to enhance the robustness of imitation learning methods by generating corrective data to account for compounding errors and disturbances. While existing methods rely on interactive expert labeling, additional offline datasets, or domain-specific invariances, our approach requires minimal additional assumptions beyond access to expert data. The key insight is to leverage local continuity in the environment dynamics to generate corrective labels. Our method first constructs a dynamics model from the expert demonstrations, enforcing local Lipschitz continuity in the learned model. In locally continuous regions, this model allows us to generate corrective labels within the neighborhood of the demonstrations but beyond the actual set of states and actions in the dataset. Training on this augmented data enhances the agent's ability to recover from perturbations and deal with compounding errors. We demonstrate the effectiveness of our generated labels through experiments over a variety of robotics domains.
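To make the abstract's pipeline concrete, here is a minimal, hypothetical sketch of the general idea (not the authors' released implementation): a dynamics model is fit to expert transitions with a weight constraint that encourages Lipschitz continuity, and corrective labels are then generated by perturbing demonstrated states and solving for actions that the learned model predicts will return to the demonstrated next state. The class and function names (`LipschitzDynamics`, `generate_corrective_labels`), the use of spectral normalization as the continuity regularizer, and all hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch, assuming a residual MLP dynamics model f(s, a) -> s'
# trained on expert transitions. Spectral normalization is used here as one
# simple way to bound the model's Lipschitz constant; the paper enforces
# *local* Lipschitz continuity, which this sketch only approximates.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm


class LipschitzDynamics(nn.Module):
    """Residual dynamics model s_{t+1} = s_t + g(s_t, a_t) with norm-bounded layers."""

    def __init__(self, state_dim, action_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            spectral_norm(nn.Linear(state_dim + action_dim, hidden)), nn.Tanh(),
            spectral_norm(nn.Linear(hidden, hidden)), nn.Tanh(),
            spectral_norm(nn.Linear(hidden, state_dim)),
        )

    def forward(self, s, a):
        return s + self.net(torch.cat([s, a], dim=-1))


def generate_corrective_labels(model, states, actions, next_states,
                               noise_scale=0.05, steps=50, lr=0.1):
    """For each expert transition (s, a, s'), sample a perturbed state near s
    and optimize a corrective action that the learned model predicts will
    bring the perturbed state back to the demonstrated next state s'."""
    perturbed = states + noise_scale * torch.randn_like(states)
    labels = actions.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([labels], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = model(perturbed, labels)
        loss = ((pred - next_states) ** 2).sum(dim=-1).mean()
        loss.backward()
        opt.step()
    # (perturbed, labels) pairs augment the expert data for policy training.
    return perturbed.detach(), labels.detach()
```

Under this sketch, the augmented pairs are simply concatenated with the original demonstrations before behavior cloning, so the policy sees states slightly off the expert trajectory together with actions that steer back toward it.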
Submission Number: 10