Hybrid Domain Convolutional Neural Network for Memory Efficient Training

CICAI 2021 (modified: 02 Nov 2022)
Abstract: For many popular Convolutional Neural Networks (CNNs), memory has become one of the major constraints on efficient training and inference on edge devices. Recent work has shown that the bottleneck lies in the feature maps generated by convolutional layers. In this work, we propose a hybrid domain Convolutional Neural Network (HyNet) to reduce the memory footprint. Specifically, HyNet prunes the filters in the spatial domain and sparsifies the feature maps in the frequency domain. HyNet also introduces a specially designed activation function in the frequency domain that preserves the sparsity of the feature maps while effectively strengthening training convergence. We evaluate HyNet with three state-of-the-art networks (VGG, DenseNet, and ResNet) on the CIFAR-10 and ImageNet image classification benchmarks, and compare it against several memory-efficient training approaches. Overall, HyNet reduces memory consumption by roughly 50% without significant accuracy loss.
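To make the frequency-domain idea concrete, the following is a minimal illustrative sketch (not the paper's actual method) of sparsifying a feature map in the frequency domain and applying a sparsity-preserving activation there. The function names, the `keep_ratio` top-k selection, and the soft-threshold-style activation are all assumptions introduced here for illustration.

```python
import numpy as np

def frequency_sparsify(feature_map, keep_ratio=0.5):
    """Hypothetical sketch: transform a 2-D feature map to the frequency
    domain and keep only the largest-magnitude fraction of coefficients,
    zeroing the rest to obtain a sparse representation."""
    coeffs = np.fft.fft2(feature_map)            # move to frequency domain
    mags = np.abs(coeffs)
    k = max(1, int(coeffs.size * keep_ratio))    # number of coefficients kept
    cutoff = np.partition(mags.ravel(), -k)[-k]  # k-th largest magnitude
    return np.where(mags >= cutoff, coeffs, 0.0)

def freq_activation(coeffs, alpha=0.1):
    """Hypothetical sparsity-preserving activation in the frequency domain:
    shrinks coefficient magnitudes toward zero (soft-threshold style), so
    coefficients that are already zero remain zero."""
    mags = np.abs(coeffs)
    scale = np.maximum(mags - alpha, 0.0) / np.maximum(mags, 1e-12)
    return coeffs * scale
```

Because the activation only shrinks magnitudes, it never turns a zero coefficient back into a nonzero one, which is the property the abstract attributes to HyNet's frequency-domain activation.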
