Learnable Heterogeneous Convolution: Learning both topology and strength

Published: 01 Jan 2021 · Last Modified: 08 Apr 2025 · Neural Networks 2021 · CC BY-SA 4.0
Abstract: Existing convolution techniques in artificial neural networks suffer from high computational complexity, whereas biological neural networks work in a far more powerful yet efficient way. Inspired by the biological plasticity of dendritic topology and synaptic strength, our method, Learnable Heterogeneous Convolution, realizes joint learning of kernel shape and weights, unifying existing handcrafted convolution techniques in a data-driven way. A model based on our method converges to structurally sparse weights and can then be accelerated on highly parallel devices. In our experiments, the method either reduces the computation of VGG16/19 and ResNet34/50 by nearly 5× on CIFAR10 and 2× on ImageNet without harming performance, while compressing the weights by 10× and 4× respectively; or improves accuracy by up to 1.0% on CIFAR10 and 0.5% on ImageNet with slightly higher efficiency. The code will be available at www.github.com/Genera1Z/LearnableHeterogeneousConvolution.
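The core idea — learning a kernel's topology (which taps are connected) jointly with its strength (the tap weights) — can be sketched as an effective kernel formed by masking dense weights with a thresholded, learnable score map. The function and parameter names below are hypothetical illustrations, not the paper's actual formulation:

```python
import numpy as np

def heterogeneous_conv2d(x, weights, mask_logits, threshold=0.0):
    """Illustrative sketch: a convolution whose kernel shape (topology)
    and weights (strength) are both learnable.

    x           : (H, W) input feature map
    weights     : (k, k) synaptic strengths, trained by gradient descent
    mask_logits : (k, k) real-valued scores; positions with a logit above
                  `threshold` are kept, giving the kernel a sparse shape
    All names here are assumptions for illustration only.
    """
    mask = (mask_logits > threshold).astype(weights.dtype)  # binary topology
    kernel = weights * mask                                 # effective sparse kernel
    k = kernel.shape[0]
    H, W = x.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # valid (no-padding) correlation with the masked kernel
            out[i, j] = np.sum(x[i:i + k, j:j + k] * kernel)
    return out, mask
```

Because the mask zeroes whole kernel positions rather than arbitrary scalars, the resulting sparsity is structural, which is what makes acceleration on parallel hardware feasible. In training, the hard threshold would typically be relaxed (e.g. with a straight-through estimator) so the topology scores receive gradients.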