Mutual Information-guided Knowledge Transfer for Open-World Semi-Supervised Learning

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission
Keywords: Novel Class Discovery, Open-world Semi-supervised learning, Knowledge Transfer, Mutual Information
Abstract: We tackle the open-world semi-supervised learning problem, aiming to cluster novel classes and classify seen classes in unlabeled data based on labeled data from seen classes. The main challenge is to transfer the knowledge contained in seen-class data to unseen classes. Previous methods mostly transfer knowledge through a shared representation space. However, they learn the classifiers for seen and unseen classes in a disjoint manner, neglecting the underlying relation between predictions on the two sets of classes. As a result, the learned representations and classifiers are less effective for clustering unseen classes. In this paper, we propose a novel and general method to transfer knowledge between seen and unseen classes. Our insight is to use mutual information to measure the generic statistical dependency between seen and unseen classes in the classifier output space, which couples the learning of the two classifiers and promotes knowledge transfer between the two data subsets. To validate the effectiveness and generality of our method, we conduct extensive experiments on several benchmarks, including the CIFAR-10/100, ImageNet-100, Oxford-IIIT Pet, and FGVC-Aircraft datasets. Our results show that the proposed method outperforms the previous state of the art by a significant margin on almost all benchmarks.
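The abstract describes measuring mutual information between seen-class and unseen-class predictions in the classifier output space. The sketch below is one plausible way to estimate such a quantity from a batch of softmax outputs, in the style of common clustering-based MI objectives; it is an illustrative assumption, not the authors' implementation, and the function and variable names are hypothetical.

```python
import numpy as np

def mutual_information(p_seen, p_unseen, eps=1e-12):
    """Estimate I(seen; unseen) from per-sample class posteriors.

    p_seen:   (batch, C_seen) softmax outputs of a seen-class head
    p_unseen: (batch, C_unseen) softmax outputs of an unseen-class head

    Hypothetical sketch: the empirical joint over (seen, unseen)
    predictions is taken as the batch average of per-sample outer
    products, and MI is computed from that joint and its marginals.
    """
    # Empirical joint distribution, averaged over the batch.
    joint = (p_seen[:, :, None] * p_unseen[:, None, :]).mean(axis=0)
    joint = joint / joint.sum()
    m_seen = joint.sum(axis=1, keepdims=True)    # marginal over seen classes
    m_unseen = joint.sum(axis=0, keepdims=True)  # marginal over unseen classes
    # I(X; Y) = sum_{ij} p(i,j) [log p(i,j) - log p(i) - log p(j)]
    return float((joint * (np.log(joint + eps)
                           - np.log(m_seen + eps)
                           - np.log(m_unseen + eps))).sum())
```

Maximizing such a term would encourage predictions on the two heads to be statistically dependent (e.g., perfectly aligned one-hot predictions over C classes give MI = log C, while independent predictions give MI near 0), which is one way the learning of the two classifiers could be coupled.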
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning