Self-Supervised Learning for Binary Networks by Joint Classifier Training

29 Sept 2021 (modified: 04 May 2025) · ICLR 2022 Conference Withdrawn Submission
Keywords: binary networks, unsupervised representation learning
Abstract: Despite the great success of self-supervised learning with large floating point networks, such networks are not readily deployable to edge devices. To bring the benefits of unsupervised representation learning to models deployed on edge devices for various downstream tasks, we propose a self-supervised learning method for binary networks. In particular, we attach a randomly initialized classifier to a pretrained floating point feature extractor, use its outputs as targets, and train it jointly with a binary network. To improve training of the binary network, we further propose a feature similarity loss, a dynamic balancing scheme for the loss terms, and a modified multi-stage training procedure. We call our method BSSL. Our empirical validations show that BSSL outperforms self-supervised learning baselines for binary networks on various downstream tasks and outperforms supervised pretraining on certain tasks.
One-sentence Summary: The first self-supervised learning method designed specifically for binary networks, requiring no labels at any stage of training.
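The abstract describes the training setup only at a high level, so the following is a minimal, hypothetical PyTorch sketch of one plausible reading: a frozen pretrained floating point extractor with a jointly trained, randomly initialized classifier producing targets for a binary network, a cosine feature similarity loss, and a dynamically balanced combination of the two terms. All module names (`fp_extractor`, `binary_net`, `joint_classifier`), the KL-based matching objective, the cosine form of the similarity loss, and the linear balancing ramp are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins: in practice `fp_extractor` would be a pretrained
# floating point backbone (kept frozen) and `binary_net` a binarized backbone;
# simple linear modules are used here so the sketch runs end to end.
feat_dim, num_pseudo_classes = 128, 100
fp_extractor = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, feat_dim))
fp_extractor.requires_grad_(False)  # the pretrained FP extractor stays fixed

binary_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, feat_dim))

# Randomly initialized classifier attached to the FP extractor; its outputs
# serve as targets and it is trained jointly with the binary network.
joint_classifier = nn.Linear(feat_dim, num_pseudo_classes)
binary_classifier = nn.Linear(feat_dim, num_pseudo_classes)

params = (
    list(binary_net.parameters())
    + list(joint_classifier.parameters())
    + list(binary_classifier.parameters())
)
optimizer = torch.optim.Adam(params, lr=1e-3)

def training_step(images, step, total_steps):
    fp_feats = fp_extractor(images)   # target features (extractor is frozen)
    bin_feats = binary_net(images)    # features of the binary network

    # Classifier-matching term: the binary network's predictions are pulled
    # toward the jointly trained classifier's outputs on the FP features.
    # KL divergence is an assumed choice; gradients flow into both branches,
    # which is one way to realize "joint" training of the classifier.
    targets = F.softmax(joint_classifier(fp_feats), dim=1)
    log_preds = F.log_softmax(binary_classifier(bin_feats), dim=1)
    cls_loss = F.kl_div(log_preds, targets, reduction="batchmean")

    # Feature similarity loss between binary and FP features; the abstract
    # names this loss but not its form, so cosine similarity is an assumption.
    feat_loss = 1.0 - F.cosine_similarity(bin_feats, fp_feats, dim=1).mean()

    # Dynamic balancing of the two terms; a linear ramp is a placeholder for
    # whatever schedule the paper actually uses.
    lam = step / max(total_steps - 1, 1)
    return lam * cls_loss + (1.0 - lam) * feat_loss

# Toy usage on random tensors standing in for unlabeled images.
images = torch.randn(8, 3, 32, 32)
for step in range(5):
    loss = training_step(images, step, total_steps=5)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```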
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/self-supervised-learning-for-binary-networks/code)