How Well Does Self-Supervised Pre-Training Perform with Streaming ImageNet?

Published: 24 Nov 2021, Last Modified: 05 May 2023
Venue: ImageNet PPF 2021
Keywords: Self-supervised Learning, Pre-training, ImageNet, Streaming Data
TL;DR: We find that sequential SSL is more efficient than, and performance-competitive with, joint pre-training, making it a practical representation learning choice for real-world pre-training applications.
Abstract: Prior works on self-supervised pre-training focus on the joint training scenario, where massive unlabeled data are assumed to be given as input all at once, and only then is a learner trained. Unfortunately, such a problem setting is often impractical, if not infeasible, since many real-world tasks rely on sequential learning, e.g., data are decentralized or collected in a streaming fashion. In this paper, we conduct the first thorough and dedicated investigation of self-supervised pre-training with streaming data, aiming to shed light on model behavior under this overlooked setup. Specifically, we pre-train over 500 models on four categories of pre-training streaming data from ImageNet and DomainNet and evaluate them on three types of downstream tasks and 12 different downstream datasets. Our studies show that, somewhat beyond our expectation, with simple data replay or parameter regularization, sequential self-supervised pre-training turns out to be an efficient alternative to joint pre-training, as the performance of the former is mostly on par with that of the latter. Moreover, catastrophic forgetting, a common issue in sequential supervised learning, is much alleviated in sequential self-supervised learning (SSL), which we justify through a comprehensive empirical analysis of the learned representations and the sharpness of minima in the loss landscape. Our findings therefore suggest that, in practice, cumbersome joint training can largely be replaced by sequential SSL, which in turn enables a much broader spectrum of potential application scenarios.
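For concreteness, the following is a minimal illustrative sketch (not the authors' implementation) of sequential self-supervised pre-training with the simple data-replay strategy mentioned in the abstract. The names sequential_pretrain, ssl_loss, data_chunks, replay_size, and replay_ratio are hypothetical placeholders for a generic PyTorch-style setup; parameter regularization (e.g., an EWC-style penalty) would replace the replay buffer with a penalty term added to the loss.

    # Illustrative sketch only; assumes PyTorch and a generic SSL objective.
    import random
    import torch

    def sequential_pretrain(model, optimizer, data_chunks, ssl_loss,
                            replay_size=1024, replay_ratio=0.25):
        # Pre-train on a stream of data chunks, replaying a small buffer of
        # past samples alongside each new chunk to mitigate forgetting.
        replay_buffer = []                      # image tensors from earlier chunks
        for chunk in data_chunks:               # chunks arrive one after another
            for batch in chunk:                 # batch: tensor of shape (B, C, H, W)
                if replay_buffer:
                    # Mix a fraction of replayed samples into the current batch.
                    k = min(len(replay_buffer),
                            max(1, int(replay_ratio * batch.size(0))))
                    replayed = torch.stack(random.sample(replay_buffer, k))
                    batch = torch.cat([batch, replayed], dim=0)
                loss = ssl_loss(model, batch)   # any SSL objective, e.g., contrastive
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            # Refresh the buffer with a random subset of old and new samples.
            new_samples = [img for batch in chunk for img in batch]
            pool = replay_buffer + new_samples
            replay_buffer = random.sample(pool, min(replay_size, len(pool)))
        return model

The key design point this sketch illustrates is that each new data chunk is trained on only once, while a small, fixed-size buffer of past samples keeps the objective exposed to earlier distributions at negligible extra cost compared with joint training over all data.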
Submission Track: Main track, 5 pages max