Margin-based Self-Supervised Neural Architecture Search

TMLR Paper 352 Authors

10 Aug 2022 (modified: 17 Sept 2024) · Rejected by TMLR · CC BY 4.0
Abstract: Neural Architecture Search (NAS) has recently been used to achieve improved performance in various tasks, most prominently in image classification. Yet, most search strategies rely on large labeled datasets, which limits their usage in cases where only a small fraction of the data is annotated. Self-supervised learning has shown great promise in training neural networks using unlabeled data. In this work, we propose a self-supervised neural architecture search (SSNAS) that allows finding novel network models without the need for labeled data. We show that such a search leads to results comparable to supervised training with a ``fully labeled'' NAS. While similar results have been shown in concurrent works, the uniqueness of this work is that we also show that such a search can improve the performance of self-supervised learning itself: using the learned architectures for self-supervised representation learning leads to improved performance. Thus, SSL can both improve NAS and be improved by it. Specifically, motivated by the common case of resource constraints, we exhibit the advantage of our approach when the number of labels available during the search is relatively small.
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Yunhe_Wang1
Submission Number: 352