Keywords: semi-supervised learning, f-divergence
TL;DR: We propose novel empirical risk functions for self-training methods (pseudo-labeling and entropy minimization), inspired by f-divergences.
Abstract: This paper investigates a range of empirical risk functions and regularization methods suitable for self-training in semi-supervised learning, drawing inspiration from $f$-divergences. In self-training methods such as pseudo-labeling and entropy minimization, the training process inherently suffers from a mismatch between true labels and pseudo-labels (noisy pseudo-labels); our empirical risk functions are robust to such noisy pseudo-labels. Under some conditions, our empirical risk functions demonstrate better performance than traditional self-training methods.
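To make the pseudo-labeling setting concrete, the following is a minimal generic sketch of confidence-thresholded pseudo-labeling on unlabeled data. It is an illustration of the standard technique the abstract refers to, not the paper's $f$-divergence-based empirical risk; the function names and the threshold value are assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax over class logits."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def pseudo_label(probs, threshold=0.9):
    """Keep only unlabeled points the model is confident about.

    Returns a boolean mask of retained points and the argmax
    pseudo-labels. Points below the confidence threshold are
    discarded rather than given a (likely noisy) pseudo-label.
    """
    conf = probs.max(axis=1)
    mask = conf >= threshold
    return mask, probs.argmax(axis=1)

# Toy unlabeled predictions: 3 points, 2 classes (illustrative logits).
probs = softmax(np.array([[2.5, 0.1],    # confident in class 0
                          [0.2, 0.3],    # uncertain -> discarded
                          [0.0, 3.0]]))  # confident in class 1
mask, labels = pseudo_label(probs, threshold=0.9)
print(mask.tolist(), labels[mask].tolist())  # [True, False, True] [0, 1]
```

Under self-training, the retained pseudo-labeled points would be added to the labeled set and the model retrained; the noisy-pseudo-label mismatch the abstract highlights arises precisely because thresholding does not guarantee the pseudo-labels match the true labels.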
Submission Number: 152