Split Batch Normalization: Improving Semi-Supervised Learning under Domain Shift

Mar 20, 2019 (edited Oct 18, 2019) · ICLR 2019 Workshop LLD
  • Keywords: semi-supervised learning, domain shift, image classification, deep neural networks
  • Abstract: Recent work has shown that using unlabeled data in semi-supervised learning is not always beneficial and can even hurt generalization, especially when there is a class mismatch between the unlabeled and labeled examples. We investigate this phenomenon for image classification under several other forms of domain shift as well (e.g., salt-and-pepper noise). Our main contribution is showing how batch-normalized neural networks can benefit from additional unlabeled data drawn from a shifted distribution. We achieve this by simply maintaining separate batch normalization statistics for the unlabeled examples. Given its simplicity, we recommend it as standard practice.
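The core idea — separate batch normalization statistics for the labeled and unlabeled streams — can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation; the class name, the `stream` argument, and the per-stream running-statistics dictionary are assumptions made for clarity:

```python
import numpy as np

class SplitBatchNorm:
    """Sketch of split batch normalization: one set of learnable affine
    parameters, but separate running mean/variance per data stream
    (labeled vs. unlabeled), so shifted unlabeled data cannot corrupt
    the statistics used for the labeled task."""

    def __init__(self, num_features, momentum=0.9, eps=1e-5):
        self.gamma = np.ones(num_features)   # shared scale
        self.beta = np.zeros(num_features)   # shared shift
        self.momentum = momentum
        self.eps = eps
        # Separate running statistics for each stream.
        self.running = {
            "labeled":   {"mean": np.zeros(num_features), "var": np.ones(num_features)},
            "unlabeled": {"mean": np.zeros(num_features), "var": np.ones(num_features)},
        }

    def forward(self, x, stream="labeled", training=True):
        stats = self.running[stream]
        if training:
            # Normalize with the current batch's statistics and update
            # only this stream's running estimates.
            mean, var = x.mean(axis=0), x.var(axis=0)
            stats["mean"] = self.momentum * stats["mean"] + (1 - self.momentum) * mean
            stats["var"] = self.momentum * stats["var"] + (1 - self.momentum) * var
        else:
            # At test time, use the stream's accumulated statistics.
            mean, var = stats["mean"], stats["var"]
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

In a semi-supervised training loop, labeled batches would be passed with `stream="labeled"` and unlabeled batches with `stream="unlabeled"`; at test time only the labeled statistics are used.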