Multi-Class Classification from Single-Class Data with Confidences

29 Sept 2021 (modified: 22 Oct 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Weakly supervised learning, unbiased risk estimator, empirical risk minimization
Abstract: Can we learn a multi-class classifier from \emph{data of a single class} alone? We show that, without any assumptions on the loss function, model, or optimizer, a multi-class classifier can be learned from single-class data with a rigorous consistency guarantee whenever \emph{confidences} (i.e., the class-posterior probabilities) are available. Specifically, we propose an empirical risk minimization framework that is loss-, model-, and optimizer-independent (a minimal sketch of the key risk-rewriting step is given below). Instead of constructing a boundary between the given class and all other classes, our method conducts discriminative classification among all classes even though no data from the other classes are given. We further show, theoretically and experimentally, that with a simple modification our method remains Bayes-consistent even when the provided confidences are highly noisy. Finally, we extend our method to the case where data from a subset of all classes are available. Experimental results demonstrate the effectiveness of our methods.
One-sentence Summary: We show that provably consistent multi-class classification can be performed with only data from a single class and their confidences, even if the confidences are extremely noisy.
Supplementary Material: zip
Community Implementations: [2 code implementations](https://www.catalyzex.com/paper/arxiv:2106.08864/code)
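
The key step promised in the abstract is a risk rewriting: since $p(x \mid y=k) = p(y=k \mid x)\,p(x)/p(y=k)$, the full classification risk $\mathbb{E}_{p(x)}\bigl[\sum_y p(y \mid x)\,\ell(f(x), y)\bigr]$ equals, up to the constant factor $p(y=k)$, an expectation over class-$k$ data alone with each loss term weighted by $p(y \mid x)/p(y=k \mid x)$. Below is a minimal PyTorch sketch of the resulting empirical estimator, using a softmax cross-entropy loss for concreteness (the framework itself is loss-independent); the function name `single_class_risk` and the toy setup are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (not the authors' released code) of the importance-weighted
# risk estimator implied by the abstract: the full risk
#   R(f) = E_{p(x)} [ sum_y p(y|x) * loss(f(x), y) ]
# is rewritten, via p(x) = p(y=k) * p(x | y=k) / p(y=k | x), as an expectation
# over class-k data with each term weighted by r(y|x) / r(k|x).
import torch
import torch.nn.functional as F

def single_class_risk(logits, confidences, k):
    """Empirical classification risk from class-k data only.

    logits:       (n, K) model outputs f(x_i) on samples drawn from p(x | y=k)
    confidences:  (n, K) class-posterior probabilities r_i(y) = p(y | x_i)
    k:            index of the single observed class

    Returns the risk up to the constant factor p(y=k), which does not
    affect the minimizer.
    """
    log_probs = F.log_softmax(logits, dim=1)       # -log_probs[i, y] is the cross-entropy loss for label y
    loss_all = -log_probs                          # (n, K): loss(f(x_i), y) for every candidate label y
    weights = confidences / confidences[:, k:k+1]  # importance weights r_i(y) / r_i(k)
    return (weights * loss_all).sum(dim=1).mean()

# Toy usage: 3 classes, data observed only from class 0.
if __name__ == "__main__":
    torch.manual_seed(0)
    n, K, k = 128, 3, 0
    x = torch.randn(n, 5)                           # stand-in features from p(x | y=k)
    model = torch.nn.Linear(5, K)
    conf = torch.softmax(torch.randn(n, K), dim=1)  # stand-in confidences p(y | x_i)
    risk = single_class_risk(model(x), conf, k)
    risk.backward()                                 # differentiable, so usable inside an ERM training loop
    print(float(risk))
```

Because the weights divide by $r_i(k)$, the estimator is well behaved only when the observed class's confidences are bounded away from zero; the noise-robust modification mentioned in the abstract targets the regime where the provided confidences are unreliable.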