Adapting Neural Models with Sequential Monte Carlo Dropout

16 Jun 2022, 10:45 (modified: 15 Nov 2022, 02:54) · CoRL 2022 Poster
Student First Author: no
Keywords: Model Adaptation, Meta-Learning, Online Robot Control and Prediction
TL;DR: We infer dropout masks at run time to adapt a neural model to changing conditions. The masks also capture context-dependent information.
Abstract: The ability to adapt to changing environments and settings is essential for robots acting in dynamic and unstructured environments or working alongside humans with varied abilities or preferences. This work introduces an extremely simple and effective approach to adapting neural models in response to changing settings, without requiring any specialised meta-learning strategies. We first train a standard network using dropout, which is analogous to learning an ensemble of predictive models or distribution over predictions. At run-time, we use a particle filter to maintain a distribution over dropout masks to adapt the neural model to changing settings in an online manner. Experimental results show improved performance in control problems requiring both online and look-ahead prediction, and showcase the interpretability of the inferred masks in a human behaviour modelling task for drone tele-operation.
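The abstract describes maintaining a particle-filter distribution over dropout masks at run time, reweighting and resampling masks as new observations arrive. Below is a minimal, hypothetical sketch of that idea on a toy one-hidden-layer network: the network weights, the Gaussian observation likelihood, the jitter step, and all hyperparameters (`N_PARTICLES`, `KEEP_PROB`, `NOISE_STD`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a network trained with dropout on its hidden layer
# (hypothetical weights; in the paper these would come from training).
HIDDEN = 16
W1 = rng.normal(size=(1, HIDDEN))
W2 = rng.normal(size=(HIDDEN, 1))

def predict(x, mask):
    """Forward pass with a fixed binary dropout mask on the hidden layer."""
    h = np.tanh(x @ W1) * mask  # mask zeroes out dropped hidden units
    return (h @ W2).item()

N_PARTICLES = 100
KEEP_PROB = 0.5   # dropout keep probability used to seed the particles
NOISE_STD = 0.5   # assumed observation-noise scale for the likelihood

# Each particle is one dropout mask; importance weights start uniform.
masks = (rng.random((N_PARTICLES, HIDDEN)) < KEEP_PROB).astype(float)
weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def update(x, y):
    """One filter step: reweight masks by how well their prediction
    explains the new observation (x, y), then resample if degenerate."""
    global masks, weights
    preds = np.array([predict(np.array([[x]]), m) for m in masks])
    lik = np.exp(-0.5 * ((y - preds) / NOISE_STD) ** 2)
    weights = weights * lik + 1e-12
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N_PARTICLES / 2:
        idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
        masks = masks[idx]
        # Flip a few mask bits to keep particle diversity (assumed jitter step).
        flip = (rng.random(masks.shape) < 0.02).astype(float)
        masks = np.abs(masks - flip)
        weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def posterior_mean(x):
    """Weighted-ensemble prediction under the current mask distribution."""
    preds = np.array([predict(np.array([[x]]), m) for m in masks])
    return float(np.sum(weights * preds))

# Simulate an observation stream from a changed setting.
for t in range(50):
    x = rng.normal()
    y = np.sin(x)  # stand-in for the new environment's input-output mapping
    update(x, y)
```

The key property this sketch illustrates is that adaptation happens without gradient updates: only the discrete mask distribution changes online, while the trained weights stay fixed.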
Supplementary Material: zip