Pre-pruned Distillation for Point Cloud-based 3D Object Detection

Published: 01 Jan 2024, Last Modified: 05 Mar 2025 · IV 2024 · CC BY-SA 4.0
Abstract: Knowledge distillation has recently been proven effective for compressing and accelerating point cloud-based 3D object detection models. However, the complementary technique of network pruning is often overlooked during knowledge distillation. In this paper, we propose a pre-pruned distillation framework that combines network pruning and knowledge distillation to better transfer knowledge from the teacher to the student. To maintain feature consistency between the student and the teacher, we first train a teacher model and then derive a compact student model from it by structural channel pruning. We then employ multi-source knowledge distillation to transfer both mid-level and high-level information to the student model. Additionally, to improve the student model's detection performance, we propose a soft pivotal position selection mask that emphasizes features in foreground regions during distillation. We conduct experiments on both pillar- and voxel-based 3D object detectors on the Waymo dataset, demonstrating the effectiveness of our approach in compressing point cloud-based 3D detectors.
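The abstract does not give the exact loss, but the soft-mask idea can be illustrated with a minimal sketch: weight a feature-imitation loss between student and teacher BEV feature maps by a soft mask that is larger near foreground (object) regions. The function name, tensor shapes, and normalization below are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def soft_masked_distill_loss(student_feat, teacher_feat, soft_mask, eps=1e-8):
    """Hypothetical feature-imitation loss weighted by a soft foreground mask.

    student_feat, teacher_feat: (C, H, W) BEV feature maps of student/teacher.
    soft_mask: (H, W) weights in [0, 1], larger near object (foreground) cells.
    Returns a scalar loss normalized by total mask weight and channel count.
    """
    diff = (student_feat - teacher_feat) ** 2      # per-cell squared error
    weighted = diff * soft_mask[None, :, :]        # emphasize foreground regions
    return weighted.sum() / (soft_mask.sum() * student_feat.shape[0] + eps)
```

In practice such a mask could be derived from ground-truth boxes or from the teacher's classification response; here it is simply an input so the weighting mechanism is clear.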