RelationMatch: Matching In-batch Relationships for Semi-supervised Learning

Submitted: 16 Sept 2023 (modified: 11 Feb 2024), ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: semi-supervised learning, self-training, matrix cross-entropy
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We introduce RelationMatch, a semi-supervised learning algorithm that leverages in-batch relationships via a novel Matrix Cross-Entropy loss, outperforming several established state-of-the-art methods.
Abstract: Semi-supervised learning has gained prominence for its ability to utilize limited labeled data alongside abundant unlabeled data. However, prevailing algorithms often neglect the relationships among data points within a batch, focusing instead on augmentations from identical sources. This paper presents RelationMatch, an innovative semi-supervised learning framework that capitalizes on these relationships through a novel Matrix Cross-Entropy (MCE) loss function. We rigorously derive MCE from both matrix analysis and information geometry perspectives. Our extensive empirical evaluations, including a 15.21% accuracy improvement over FlexMatch on the STL-10 dataset, demonstrate that RelationMatch consistently outperforms existing state-of-the-art methods.
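The abstract describes a Matrix Cross-Entropy (MCE) loss over in-batch relationships. The paper's exact definition is not reproduced here; as an illustration only, the sketch below implements one standard matrix generalization of cross-entropy, tr(−P log Q + Q), applied to trace-normalized Gram matrices of per-batch class predictions. The function names, the Gram-matrix construction, and the unit-trace normalization are assumptions for this sketch, not the authors' code.

```python
import numpy as np

def relation_matrix(probs):
    """In-batch relation matrix from class probabilities.

    probs: (B, C) array, one row of class probabilities per example.
    Returns a (B, B) Gram matrix normalized to unit trace
    (a density-matrix-style normalization; an assumption of this sketch).
    """
    R = probs @ probs.T
    return R / np.trace(R)

def matrix_log(M, eps=1e-8):
    """Matrix logarithm of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    # Clip tiny eigenvalues so the log stays finite for rank-deficient M.
    return V @ np.diag(np.log(np.clip(w, eps, None))) @ V.T

def matrix_cross_entropy(P, Q):
    """One matrix generalization of cross-entropy: tr(-P log Q + Q)."""
    return float(np.trace(-P @ matrix_log(Q) + Q))
```

With unit-trace inputs, MCE(P, Q) − MCE(P, P) equals the von Neumann relative entropy, which is nonnegative, so minimizing MCE pulls the relation matrix of one augmented view toward that of the other.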
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 523