Learning with Few-Shot Complementary Labels

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Complementary-label learning, Few-shot learning
Abstract: Complementary-label (CL) learning deals with the weakly supervised scenario in which each training instance is associated with one complementary label, which specifies a class that the instance does not belong to. Because existing CL algorithms rely on the assumption of a large amount of labeled/unlabeled training data, they cannot be applied to few-shot scenarios without a loss in performance. To bridge this gap, we propose a Few-shot Complementary-Label (FsCL) training pattern with three kinds of surrogate loss, built on Model-Agnostic Meta-Learning (MAML) and bilevel optimization. In the outer loop, FsCL first adjusts the inductive bias of the meta-learner to counteract the misleading supervision of complementary labels and the insufficient sample diversity. The inner loop then solves the target FsCL classification problem with a base learner initialized from the meta-learner. An unseen example can thus be classified by taking the maximum-probability output of the base learner. We demonstrate the effectiveness of our approach through an extensive empirical study and theoretical analysis.
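The abstract's bilevel structure (an inner loop that adapts a base learner to a task's complementary-labeled support set, and an outer loop that updates the meta-learner on the query set) can be sketched as below. This is a minimal illustrative toy, not the paper's method: the linear model, the problem sizes, the first-order approximation of the meta-gradient, and the choice of the single surrogate loss `-log(1 - p_{ybar})` are all assumptions made here for the sake of a self-contained example.

```python
import numpy as np

# Illustrative sizes only; the paper's actual architecture and losses differ.
rng = np.random.default_rng(0)
D, C = 2, 3  # feature dimension, number of classes

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def comp_loss(w, X, ybar):
    """One common complementary-label surrogate, -log(1 - p_{ybar}):
    it pushes probability mass away from the class the example is
    known NOT to belong to."""
    P = softmax(X @ w.reshape(D, C))
    return -np.mean(np.log(1.0 - P[np.arange(len(ybar)), ybar] + 1e-12))

def num_grad(f, w, eps=1e-5):
    """Central-difference gradient, to keep the sketch dependency-free."""
    g = np.zeros_like(w)
    for i in range(w.size):
        step = np.zeros_like(w)
        step[i] = eps
        g[i] = (f(w + step) - f(w - step)) / (2 * eps)
    return g

def sample_task():
    """A task = C Gaussian class clusters; supervision is only revealed
    as complementary labels (a class the example does not belong to)."""
    means = rng.normal(0.0, 2.0, size=(C, D))
    def draw(n=30):
        y = rng.integers(0, C, size=n)
        X = means[y] + rng.normal(0.0, 0.5, size=(n, D))
        ybar = (y + rng.integers(1, C, size=n)) % C  # ybar != y by construction
        return X, ybar
    return draw

# Bilevel training: the inner loop adapts a base learner to one task's
# support set; the outer loop updates the meta-parameters on the query
# set (first-order approximation of MAML's meta-gradient).
W = np.zeros(D * C)
inner_lr, outer_lr = 0.5, 0.1
query_losses = []
for _ in range(40):
    draw = sample_task()
    Xs, ys_bar = draw()  # support set (complementary labels only)
    Xq, yq_bar = draw()  # query set from the same task
    W_task = W - inner_lr * num_grad(lambda w: comp_loss(w, Xs, ys_bar), W)
    query_losses.append(comp_loss(W_task, Xq, yq_bar))
    W = W - outer_lr * num_grad(lambda w: comp_loss(w, Xq, yq_bar), W_task)
```

At test time, an unseen example from a new task would be classified by running the inner-loop adaptation from the meta-parameters `W` and taking the argmax of the resulting softmax output, mirroring the maximum-probability classification rule described in the abstract.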
One-sentence Summary: Our paper proposes complementary-label learning in few-shot scenarios.
Supplementary Material: zip