Kernel Normalized Convolutional Networks

Published: 02 Mar 2024, Last Modified: 02 Mar 2024. Accepted by TMLR.
Abstract: Existing convolutional neural network architectures frequently rely on batch normalization (BatchNorm) to train effectively. BatchNorm, however, performs poorly with small batch sizes and is inapplicable to differentially private training, because it computes statistics across the samples in a batch. To address these limitations, we propose kernel normalization (KernelNorm) and kernel normalized convolutional layers, and incorporate them into kernel normalized convolutional networks (KNConvNets) as the main building blocks. We implement KNConvNets corresponding to state-of-the-art ResNets while forgoing the BatchNorm layers. Through extensive experiments, we show that KNConvNets achieve performance that is higher than or competitive with their BatchNorm counterparts in image classification and semantic segmentation. They also significantly outperform batch-independent competitors, including those based on layer and group normalization, in both non-private and differentially private training. KernelNorm thus combines the batch-independence of layer and group normalization with the performance advantage of BatchNorm.
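The abstract does not spell out how KernelNorm is computed; below is a minimal PyTorch sketch, assuming the layer standardizes each kernel-sized input patch by that patch's own mean and variance before the convolution weights are applied. The class name `KNConv2d` and all implementation details here are illustrative assumptions, not the authors' code; see the linked norm-torch repository for the actual implementation.

```python
# Hypothetical sketch of a kernel normalized convolution (not the authors'
# implementation): each receptive-field patch is normalized by its own
# statistics, so no batch statistics are involved.
import torch
import torch.nn as nn
import torch.nn.functional as F


class KNConv2d(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3,
                 stride=1, padding=0, eps=1e-5):
        super().__init__()
        self.kernel_size, self.stride, self.padding, self.eps = (
            kernel_size, stride, padding, eps)
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size)
            * (in_channels * kernel_size ** 2) ** -0.5)
        self.bias = nn.Parameter(torch.zeros(out_channels))

    def forward(self, x):
        n, _, h, w = x.shape
        # im2col: extract sliding kernel-sized patches -> (N, C*k*k, L)
        patches = F.unfold(x, self.kernel_size,
                           padding=self.padding, stride=self.stride)
        # Assumed KernelNorm step: standardize each patch by its own
        # mean and variance (batch-independent, unlike BatchNorm)
        mean = patches.mean(dim=1, keepdim=True)
        var = patches.var(dim=1, unbiased=False, keepdim=True)
        patches = (patches - mean) / torch.sqrt(var + self.eps)
        # Convolution expressed as a matrix product over normalized patches
        out = self.weight.flatten(1) @ patches + self.bias.view(1, -1, 1)
        h_out = (h + 2 * self.padding - self.kernel_size) // self.stride + 1
        w_out = (w + 2 * self.padding - self.kernel_size) // self.stride + 1
        return out.view(n, -1, h_out, w_out)


# Usage: a 3x3 kernel normalized convolution on a small batch
y = KNConv2d(3, 16, kernel_size=3, padding=1)(torch.randn(2, 3, 32, 32))
print(y.shape)  # torch.Size([2, 16, 32, 32])
```

Because each patch is normalized by its own statistics, the output of one sample never depends on the other samples in the batch, which is what would make such a layer usable with small batch sizes and with per-sample gradient clipping in differentially private training.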
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=GB3blmq966
Video: https://youtu.be/V7fQTc6MNSE
Code: https://github.com/reza-nasirigerdeh/norm-torch
Assigned Action Editor: ~Dumitru_Erhan1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1584