Abstract: The paradigm of pre-training models on massive unlabeled data through self-supervised learning (SSL) and then fine-tuning them on many downstream tasks has recently become a trend. However, because of the high training cost and the lack of awareness of downstream usage at pre-training time, most self-supervised learning methods cannot accommodate the diversity of downstream scenarios, which involve various data domains, different vision tasks, and latency constraints on models. Neural architecture search (NAS) is a widely acknowledged way to address these issues, but applying NAS to SSL seems infeasible, as no labels or metrics are available for judging model selection. In this paper, we present DATA, a simple yet effective NAS approach specialized for SSL that provides Domain-Aware and Task-Aware pre-training. Specifically, we (i) train a supernet, which can be viewed as a set of millions of networks covering a wide range of model scales, without any labels, and (ii) propose a flexible searching mechanism compatible with SSL that finds networks of different computation costs for various downstream vision tasks and data domains without any explicit metric being provided. Instantiated with MoCo v2, our method achieves promising results across a wide range of computation costs on downstream tasks, including image classification, object detection, and semantic segmentation. DATA is orthogonal to most existing SSL methods and endows them with the ability to be customized for downstream needs. Extensive experiments on other SSL methods demonstrate the generalizability of the proposed method. Code is released at https://github.com/GAIA-vision/GAIA-s
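To make step (i) of the two-step recipe concrete, below is a minimal, self-contained sketch of label-free supernet pre-training with a contrastive objective, sampling a different sub-network at every iteration. Everything in it (`TinySuperNet`, the candidate widths, the simplified in-batch `info_nce` loss) is an illustrative assumption rather than the paper's released implementation; the actual instantiation with MoCo v2 additionally uses a momentum encoder and a negative queue.

```python
# Illustrative sketch only: a toy width-elastic supernet trained with a
# contrastive (label-free) objective while sampling random sub-networks.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySuperNet(nn.Module):
    """Toy supernet: each conv layer can run at any of the listed widths."""
    def __init__(self, widths=(32, 64, 128), feat_dim=128):
        super().__init__()
        self.widths = widths
        self.max_w = max(widths)
        self.conv1 = nn.Conv2d(3, self.max_w, 3, stride=2, padding=1)
        self.conv2 = nn.Conv2d(self.max_w, self.max_w, 3, stride=2, padding=1)
        self.head = nn.Linear(self.max_w, feat_dim)

    def forward(self, x, w1, w2):
        # Slicing the weight tensors emulates executing one sampled sub-network.
        x = F.relu(F.conv2d(x, self.conv1.weight[:w1], self.conv1.bias[:w1],
                            stride=2, padding=1))
        x = F.relu(F.conv2d(x, self.conv2.weight[:w2, :w1], self.conv2.bias[:w2],
                            stride=2, padding=1))
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        x = F.pad(x, (0, self.max_w - w2))   # zero-pad so the head always fits
        return F.normalize(self.head(x), dim=1)

def info_nce(q, k, temperature=0.2):
    """Simplified in-batch contrastive loss: the i-th views of both branches match."""
    logits = q @ k.detach().t() / temperature
    labels = torch.arange(q.size(0))
    return F.cross_entropy(logits, labels)

net = TinySuperNet()
opt = torch.optim.SGD(net.parameters(), lr=0.05, momentum=0.9)

for step in range(3):                        # toy loop with random "images"
    v1 = torch.randn(8, 3, 64, 64)           # two augmented views of the
    v2 = v1 + 0.1 * torch.randn_like(v1)     # same unlabeled batch
    w1, w2 = random.choice(net.widths), random.choice(net.widths)
    loss = info_nce(net(v1, w1, w2), net(v2, w1, w2))
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"step {step}: subnet=({w1},{w2}), loss={loss.item():.3f}")
```

Sharing weights by slicing, as in slimmable networks, is only one way to realize a supernet; the point of the sketch is that no labels enter the loop, so any SSL objective can drive the supernet training, and the step (ii) search can later pick a sub-network matching a target domain, task, and computation budget.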