BoolNet: Streamlining Binary Neural Networks Using Binary Feature Maps

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
ICLR 2022 Submitted
Readers: Everyone
Keywords: Binary Neural Networks, Hardware-Friendly Neural Architecture Design
Abstract: Recent works on Binary Neural Networks (BNNs) have made promising progress in narrowing the accuracy gap between BNNs and their 32-bit counterparts, often through specialized model designs that rely on additional 32-bit components. Furthermore, most previous BNNs use 32-bit values for feature maps and residual shortcuts, which helps maintain accuracy but is unfriendly to hardware accelerators with limited memory, energy, and computing resources. Thus, we raise the following question: How can accuracy and energy consumption be balanced in a BNN design? We extensively study this fundamental problem in this work and propose BoolNet: an architecture without most commonly used 32-bit components that uses 1-bit values to store feature maps. Experimental results on ImageNet demonstrate that BoolNet achieves 63.0% Top-1 accuracy coupled with an energy reduction of 2.95x compared to recent state-of-the-art BNN architectures. Code and trained models are available at: (URL in the final version).
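The core idea the abstract describes, storing feature maps as 1-bit values rather than 32-bit floats, can be illustrated with a minimal sketch. The `binarize` helper below is hypothetical and for illustration only; it is not BoolNet's actual implementation, which is described in the paper itself.

```python
import numpy as np

def binarize(x):
    # Map real-valued activations to {-1, +1} via a sign-like function;
    # zeros are mapped to +1 so every entry is strictly binary.
    return np.where(x >= 0, 1, -1).astype(np.int8)

# A float32 feature map binarized to 1-bit values needs (in principle)
# only 1/32 of the storage, which is the source of the memory savings
# the abstract refers to.
fmap = np.array([[0.7, -1.2],
                 [0.0, 3.4]], dtype=np.float32)
b = binarize(fmap)
# b == [[ 1, -1],
#       [ 1,  1]]
```

In practice the +1/-1 values would be packed into bit vectors (32 entries per machine word) so that convolutions reduce to XNOR and popcount operations, which is what makes binary feature maps attractive for resource-constrained accelerators.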
One-sentence Summary: This is the first paper that investigates the challenge of balancing accuracy and hardware-friendliness in a binary neural network design.
Supplementary Material: zip