NTKCPL: Active Learning on Top of Self-Supervised Model by Estimating True Coverage

11 May 2023 (modified: 12 Dec 2023) · Submitted to NeurIPS 2023
Keywords: active learning, low budget, neural tangent kernel, pseudo-label
TL;DR: We propose an active learning strategy that performs effectively across a broad range of annotation budgets, enabling active learning to be applied confidently on top of a self-supervised model.
Abstract: High annotation cost has driven extensive research in active learning and self-supervised learning. Recent work has shown that, in the context of supervised learning, different numbers of labels call for different active learning strategies if the strategy is to outperform the random baseline. The number of annotations at which the suitable active learning strategy changes is called the phase transition point. We found, however, that when active learning is combined with self-supervised models to improve performance, the phase transition point occurs earlier, and it becomes challenging to determine which strategy should be used for previously unseen datasets. We argue that existing active learning algorithms are heavily affected by the phase transition because the empirical risk over the entire active learning pool that these algorithms estimate is inaccurate and depends on the number of labeled samples. To address this issue, we propose a novel active learning strategy, neural tangent kernel clustering-pseudo-labels (NTKCPL). It estimates the empirical risk from pseudo-labels and from model predictions obtained with an NTK approximation. We analyze the factors affecting this approximation error and design a clustering-based pseudo-label generation method to reduce it. Finally, we validate our method on five datasets, empirically demonstrating that it outperforms the baseline methods in most cases and remains valid over a wider range of training budgets.
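The sketch below illustrates, under stated assumptions, the kind of procedure the abstract describes: pseudo-labels are obtained by clustering self-supervised features, an NTK-style kernel regression stands in for the trained model's predictions given a candidate labeled set, and the empirical risk over the whole pool is estimated as disagreement with the pseudo-labels. The function name `ntkcpl_select`, the use of a linear kernel on frozen features as the NTK surrogate, the KMeans pseudo-labeling, and the greedy candidate loop are all illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans


def ntkcpl_select(features, labeled_idx, n_clusters=10, n_select=10, reg=1e-3):
    """Greedily pick samples to label by minimizing an estimated pool risk.

    Hypothetical sketch: pseudo-labels come from clustering self-supervised
    features; kernel ridge regression on a linear kernel approximates the
    model's post-training predictions (an NTK-style surrogate).
    """
    n = len(features)

    # Pseudo-labels from clustering the self-supervised embeddings.
    pseudo = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    onehot = np.eye(n_clusters)[pseudo]

    # Linear kernel on frozen features as a stand-in for the empirical NTK.
    K = features @ features.T

    selected = list(labeled_idx)
    for _ in range(n_select):
        best_idx, best_risk = None, np.inf
        for cand in range(n):
            if cand in selected:
                continue
            idx = selected + [cand]
            # Kernel regression: approximate predictions after training on idx.
            K_ll = K[np.ix_(idx, idx)] + reg * np.eye(len(idx))
            pred = K[:, idx] @ np.linalg.solve(K_ll, onehot[idx])
            # Estimated empirical risk over the pool: disagreement with pseudo-labels.
            risk = np.mean(pred.argmax(axis=1) != pseudo)
            if risk < best_risk:
                best_risk, best_idx = risk, cand
        selected.append(best_idx)

    return [i for i in selected if i not in labeled_idx]
```

As a usage example, `ntkcpl_select(embeddings, labeled_idx=[0, 1], n_select=5)` would return five pool indices to annotate next; in practice the candidate loop would be batched or restricted to cluster representatives to keep the cost manageable.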
Supplementary Material: zip
Submission Number: 10203