Reg-PTQ: Regression-specialized Post-training Quantization for Fully Quantized Object Detector

Published: 01 Jan 2024 · Last Modified: 13 May 2025 · CVPR 2024 · CC BY-SA 4.0
Abstract: Although deep learning based object detection is of great significance for various applications, it faces challenges when deployed on edge devices due to computation and energy limitations. Post-training quantization (PTQ) can improve inference efficiency through integer computing. However, existing PTQ methods suffer from severe performance degradation when performing full quantization because they overlook the unique characteristics of regression tasks in object detection. In this paper, we are the first to explore regression-friendly quantization and conduct full quantization on various detectors. We reveal the intrinsic reason behind the difficulty of quantizing regressors with empirical and theoretical justifications, and introduce a novel Regression-specialized Post-Training Quantization (Reg-PTQ) scheme. It includes Filtered Global Loss Integration Calibration, which combines the global loss with a two-step filtering mechanism to mitigate the adverse impact of false-positive bounding boxes, and a Learnable Logarithmic-Affine Quantizer tailored to the non-uniformly distributed parameters in regression structures. Extensive experiments on prevalent detectors showcase the effectiveness of Reg-PTQ. Notably, it achieves 7.6x and 5.4x reductions in computation and storage consumption under INT4 with little performance degradation, indicating the immense potential of fully quantized detectors in real-world object detection applications.
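To illustrate the idea behind a logarithmic-affine quantizer for non-uniformly distributed regression parameters, here is a minimal sketch. The exact parameterization in Reg-PTQ is not given in the abstract; the function below, its learnable `scale`/`shift` parameters, and the sign-magnitude log mapping are assumptions for illustration only.

```python
import numpy as np

def log_affine_quantize(x, scale, shift, n_bits=4):
    """Hypothetical logarithmic-affine quantizer sketch (not the paper's
    exact formulation): map magnitudes to the log2 domain, apply a
    learnable affine transform (scale, shift), round to integer codes,
    clip to the bit budget, then invert the mapping to dequantize."""
    qmax = 2 ** (n_bits - 1) - 1
    sign = np.sign(x)
    mag = np.abs(x) + 1e-8                      # avoid log(0)
    q = np.round(np.log2(mag) * scale + shift)  # affine map in log domain
    q = np.clip(q, -qmax, qmax)                 # integer codes for n_bits
    return sign * 2.0 ** ((q - shift) / scale)  # dequantized values

# Small magnitudes get fine resolution, large ones coarse resolution,
# matching a heavy-tailed, non-uniform parameter distribution.
x = np.array([0.02, -0.5, 1.7, 8.0])
xq = log_affine_quantize(x, scale=1.0, shift=0.0)
```

With `scale=1.0` and `shift=0.0` this reduces to power-of-two quantization of the magnitude; in a PTQ calibration loop, `scale` and `shift` would instead be learned to minimize a reconstruction or task loss.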