Fleet Active Learning: A Submodular Maximization Approach

Published: 30 Aug 2023, Last Modified: 16 Oct 2023, CoRL 2023 Poster
Keywords: Active Learning, Cloud Robotics, Robotic Perception
TL;DR: This work introduces a fleet active learning framework for multi-robot systems.
Abstract: In multi-robot systems, robots often gather data to improve the performance of their deep neural networks (DNNs) for perception and planning. Ideally, these robots should select the most informative samples from their local data distributions by employing active learning approaches. However, when data collection is distributed among multiple robots, redundancy becomes an issue, as different robots may select similar data points. To overcome this challenge, we propose a fleet active learning (FAL) framework in which robots collectively select informative data samples to enhance their DNN models. Our framework leverages submodular maximization techniques to prioritize the selection of samples with high information gain. Through an iterative algorithm, the robots coordinate their efforts to collectively select the most valuable samples while minimizing communication between robots. We provide a theoretical analysis of the performance of our proposed framework and show that it approximates the NP-hard optimal solution. We demonstrate the effectiveness of our framework through experiments on real-world perception and classification datasets, including autonomous driving datasets such as Berkeley DeepDrive. Our results show improvements of up to $25.0\%$ in classification accuracy, $9.2\%$ in mean average precision, and $48.5\%$ in the submodular objective value compared to a completely distributed baseline.
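The abstract's core mechanism is greedy maximization of a monotone submodular objective, which admits the classic $(1 - 1/e)$ approximation guarantee. The sketch below is a minimal, hypothetical illustration of that building block using a facility-location objective and a toy similarity matrix; none of the names or data come from the paper, and the actual FAL algorithm coordinates such selections across robots with limited communication.

```python
# Minimal sketch (assumed, not the paper's implementation): greedy selection
# under a monotone submodular facility-location objective.

def facility_location(selected, n_points, sim):
    """f(S) = sum over all points of their max similarity to any selected point."""
    if not selected:
        return 0.0
    return sum(max(sim[i][j] for j in selected) for i in range(n_points))

def greedy_select(n_points, sim, budget):
    """Greedily pick `budget` indices, each maximizing the marginal gain of f.

    For monotone submodular f, this achieves a (1 - 1/e) approximation
    of the NP-hard optimum. In a fleet setting, each robot would run a
    round of this conditioned on the selections shared so far.
    """
    selected = []
    for _ in range(budget):
        base = facility_location(selected, n_points, sim)
        best_j, best_gain = None, float("-inf")
        for j in range(n_points):
            if j in selected:
                continue
            gain = facility_location(selected + [j], n_points, sim) - base
            if gain > best_gain:
                best_j, best_gain = j, gain
        selected.append(best_j)
    return selected

# Toy 3-point similarity matrix (illustrative values only).
sim = [[1.0, 0.1, 0.9],
       [0.1, 1.0, 0.2],
       [0.9, 0.2, 1.0]]
chosen = greedy_select(3, sim, budget=2)
```

With this toy matrix, the greedy rule first picks the point covering the most total similarity, then the point with the largest remaining marginal gain, so the second pick avoids the redundant near-duplicate of the first.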
Student First Author: yes
Supplementary Material: zip
Code: https://github.com/UTAustin-SwarmLab/Fleet-Active-Learning.git
Publication Agreement: pdf
Poster Spotlight Video: mp4