Adapting Neural Models with Sequential Monte Carlo Dropout

Published: 10 Sept 2022, Last Modified: 05 May 2023, CoRL 2022 Poster
Keywords: Model Adaptation, Meta-Learning, Online Robot Control and Prediction
TL;DR: We infer dropout masks at run time to adapt a neural model to changing conditions. The masks also capture context-dependent information.
Abstract: The ability to adapt to changing environments and settings is essential for robots acting in dynamic and unstructured environments or working alongside humans with varied abilities or preferences. This work introduces an extremely simple and effective approach to adapting neural models in response to changing settings, without requiring any specialised meta-learning strategies. We first train a standard network using dropout, which is analogous to learning an ensemble of predictive models or distribution over predictions. At run-time, we use a particle filter to maintain a distribution over dropout masks to adapt the neural model to changing settings in an online manner. Experimental results show improved performance in control problems requiring both online and look-ahead prediction, and showcase the interpretability of the inferred masks in a human behaviour modelling task for drone tele-operation.
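The abstract describes the core loop of the approach: a network is trained with dropout, and at run time a particle filter maintains a distribution over dropout masks, reweighting and resampling them as observations arrive. Below is a minimal illustrative sketch of that idea, not the authors' implementation: it uses a toy NumPy MLP, a Gaussian observation model, and hypothetical names (`predict`, `smc_dropout_step`), all of which are assumptions for exposition only.

```python
# Illustrative sketch (not the paper's code): a particle filter over dropout
# masks for online adaptation of a toy one-hidden-layer regression MLP.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "pretrained" weights for a 1-hidden-layer MLP (8 hidden units).
W1, b1 = rng.normal(size=(8, 1)), np.zeros(8)
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)

def predict(x, mask):
    """Forward pass with a fixed dropout mask applied to the hidden layer."""
    h = np.tanh(W1 @ x + b1) * mask          # mask zeroes out dropped units
    return W2 @ h + b2

# Particle set: each particle is one dropout mask over the hidden layer.
n_particles, keep_prob, obs_noise = 100, 0.8, 0.1
masks = (rng.random((n_particles, 8)) < keep_prob).astype(float)
weights = np.full(n_particles, 1.0 / n_particles)

def smc_dropout_step(x, y):
    """One online step: reweight masks by the likelihood of the observed
    target y, resample when particles degenerate, and return the adapted
    (weight-averaged) prediction."""
    global masks, weights
    preds = np.array([predict(x, m)[0] for m in masks])
    loglik = -0.5 * ((y - preds) / obs_noise) ** 2
    weights = weights * np.exp(loglik - loglik.max())
    weights /= weights.sum()
    # Resample if the effective sample size drops too low.
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        masks, weights = masks[idx], np.full(n_particles, 1.0 / n_particles)
    return float(weights @ preds)

# Usage: stream (x, y) pairs; the mask distribution adapts to the current setting.
for t in range(5):
    x_t = np.array([np.sin(0.3 * t)])
    y_t = 0.5 * x_t[0] + rng.normal(scale=obs_noise)
    print(smc_dropout_step(x_t, y_t))
```

In this sketch the posterior over masks plays the role of the adapted model: predictions are averaged under the particle weights, and the surviving masks can be inspected as a summary of the current context, mirroring the interpretability claim in the abstract.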
Student First Author: no
Supplementary Material: zip