Bridging Diversity and Uncertainty in Active Learning with Self-Supervised Pre-Training

Published: 05 Mar 2024, Last Modified: 12 May 2024, PML4LRS Poster, CC BY 4.0
Keywords: active learning
TL;DR: We propose a new robust active learning strategy that shows consistently strong performance across many settings and datasets.
Abstract: This study addresses the integration of diversity-based and uncertainty-based sampling strategies in active learning, particularly within the context of self-supervised pre-trained models. We introduce a straightforward heuristic called TCM that enhances the efficiency of active learning by mitigating the cold start problem. By initially applying TypiClust for diversity sampling and subsequently transitioning to uncertainty sampling with Margin, our approach effectively combines the strengths of both strategies. Our experiments demonstrate that TCM consistently outperforms existing methods across various datasets in both low and high data regimes. This work provides a simple yet effective guideline for leveraging active learning in practical applications, making it more accessible and efficient for practitioners.
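The abstract describes TCM as a two-phase heuristic: TypiClust-style diversity sampling while the labeled set is small, then Margin-based uncertainty sampling once enough labels exist. The sketch below is an illustrative reconstruction, not the authors' implementation: the typicality computation is a simplified version of TypiClust (inverse mean distance to the k nearest neighbours within KMeans clusters of the pre-trained embeddings), and the `switch_at` threshold is a hypothetical parameter standing in for the paper's actual transition point.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def typiclust_select(embeddings, budget, k=5, seed=0):
    """Diversity phase (simplified TypiClust sketch): cluster the unlabeled
    pool in embedding space and pick the most 'typical' point per cluster,
    where typicality is the inverse mean distance to the k nearest neighbours."""
    km = KMeans(n_clusters=budget, n_init=10, random_state=seed).fit(embeddings)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(embeddings)
    dists, _ = nn.kneighbors(embeddings)
    typicality = 1.0 / (dists[:, 1:].mean(axis=1) + 1e-12)  # skip self-distance
    selected = []
    for c in range(budget):
        members = np.where(km.labels_ == c)[0]
        selected.append(members[np.argmax(typicality[members])])
    return np.array(selected)

def margin_select(probs, budget):
    """Uncertainty phase (Margin): query the points with the smallest gap
    between the two most probable classes under the current model."""
    sorted_probs = np.sort(probs, axis=1)
    margins = sorted_probs[:, -1] - sorted_probs[:, -2]
    return np.argsort(margins)[:budget]

def tcm_select(embeddings, probs, n_labeled, budget, switch_at=100):
    """TCM sketch: TypiClust while the labeled set is small (cold start),
    Margin once n_labeled reaches the (hypothetical) switch_at threshold."""
    if n_labeled < switch_at:
        return typiclust_select(embeddings, budget)
    return margin_select(probs, budget)

# Example: one cold-start round and one later round on synthetic data.
rng = np.random.default_rng(0)
emb = rng.normal(size=(60, 8))                 # stand-in for SSL embeddings
probs = rng.dirichlet(np.ones(4), size=60)     # stand-in for model softmax
cold_batch = tcm_select(emb, probs, n_labeled=0, budget=5)
warm_batch = tcm_select(emb, probs, n_labeled=200, budget=5)
```

The design point the abstract argues for is the ordering: diversity-driven queries avoid redundant, overconfident picks when the model is untrained, while margin queries exploit the model once its probability estimates become informative.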
Submission Number: 29