Histogram-Equalized Quantization for logic-gated Residual Neural Networks

Published: 01 Jan 2022 · Last Modified: 12 Nov 2024 · ISCAS 2022 · CC BY-SA 4.0
Abstract: Adjusting the quantization to the data or to the model loss appears mandatory for reaching high accuracy with quantized neural networks. This work presents Histogram-Equalized Quantization (HEQ), an adaptive framework for linear and symmetric quantization. HEQ automatically adapts the quantization thresholds using a unique step size optimization. We empirically show that HEQ achieves state-of-the-art performance on CIFAR-10. Experiments on the STL-10 dataset further show that HEQ enables proper training of our proposed logic-gated (OR, MUX) residual networks, reaching higher accuracy at lower hardware complexity than previous work.
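The abstract describes HEQ only at a high level (linear, symmetric quantization whose thresholds follow from a step size optimization), so the sketch below is purely illustrative and not the authors' implementation. It assumes one plausible reading of "histogram-equalized" step selection: choose the step size whose quantized-level histogram is closest to uniform, here via a hypothetical maximum-entropy grid search. The names `heq_step_size` and `symmetric_quantize`, and the candidate step range, are assumptions for illustration only.

```python
import numpy as np

def symmetric_quantize(x, step, n_bits):
    """Linear symmetric quantization of x with a given step size."""
    q_max = 2 ** (n_bits - 1) - 1
    q = np.clip(np.round(x / step), -q_max, q_max)
    return q * step  # dequantized values

def heq_step_size(x, n_bits, n_candidates=100):
    """Hypothetical histogram-equalized step-size search: pick the step
    whose quantization-level occupancy is closest to uniform, i.e. the
    one that maximizes the entropy of the level histogram."""
    q_max = 2 ** (n_bits - 1) - 1
    best_step, best_entropy = None, -np.inf
    # Assumed candidate range, scaled by the data's standard deviation.
    for step in np.linspace(x.std() / q_max, 4 * x.std() / q_max, n_candidates):
        levels = np.clip(np.round(x / step), -q_max, q_max)
        counts = np.bincount((levels + q_max).astype(int),
                             minlength=2 * q_max + 1)
        p = counts / counts.sum()
        entropy = -(p[p > 0] * np.log(p[p > 0])).sum()
        if entropy > best_entropy:
            best_step, best_entropy = step, entropy
    return best_step

# Example: quantize Gaussian "weights" to 4 bits.
x = np.random.randn(10_000).astype(np.float32)
step = heq_step_size(x, n_bits=4)
x_q = symmetric_quantize(x, step, n_bits=4)
```

Maximizing the entropy of the level occupancy is one concrete way to formalize histogram equalization for a single scalar step size; the paper's actual step size optimization may differ.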