Block Pruning And Collaborative Distillation For Image Classification Of Remote Sensing

Published: 01 Jan 2024 · Last Modified: 19 Feb 2025 · IGARSS 2024 · CC BY-SA 4.0
Abstract: Convolutional Neural Networks (CNNs) have achieved remarkable performance in remote sensing image classification. To address the high complexity of these models, we propose a block-level pruning strategy based on semantic similarity analysis that requires no fine-tuning during the pruning process, effectively reducing model complexity. Furthermore, to restore the performance of the pruned model, we propose a teacher-student collaborative distillation strategy in which the original model and the dropped-blocks model jointly transfer knowledge to the compact pruned model. Experimental results demonstrate that our pruning and distillation strategies outperform other approaches, achieving favorable performance while reducing model complexity.
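The abstract does not specify the exact semantic-similarity measure used for block pruning, but a common criterion of this kind scores each block by how similar its output features are to its input features: a block that behaves like a near-identity map is redundant and can be dropped without fine-tuning. The sketch below is an illustrative assumption of such a criterion (cosine similarity over flattened activations), not the paper's verified method; the function names and the toy data are hypothetical.

```python
import numpy as np

def block_similarity(x_in, x_out):
    """Cosine similarity between a block's input and output features.
    A value near 1 means the block barely transforms its input, making it
    a pruning candidate (illustrative criterion, not the paper's exact one)."""
    a, b = x_in.ravel(), x_out.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def rank_blocks_for_pruning(features):
    """features: list of (block_input, block_output) arrays.
    Returns block indices ordered from most to least redundant."""
    sims = [block_similarity(i, o) for i, o in features]
    return sorted(range(len(sims)), key=lambda k: sims[k], reverse=True)

# Toy example: block 1 is a near-identity map, so it ranks as most prunable.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
features = [
    (x, rng.normal(size=(8, 16))),             # block 0: strong transform
    (x, x + 0.01 * rng.normal(size=(8, 16))),  # block 1: near-identity
]
print(rank_blocks_for_pruning(features))  # block 1 ranks first
```

Ranking blocks this way lets a pruning budget (e.g. "drop the k most redundant blocks") be applied in a single pass over held-out activations, which is consistent with the abstract's claim that no fine-tuning is needed during pruning itself.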
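The collaborative distillation step is described only at a high level: the original model and the dropped-blocks model together guide the compact pruned model. One plausible reading, sketched below under that assumption, is a standard temperature-softened knowledge-distillation loss where the student is pulled toward the average of the two teachers' soft predictions. The averaging scheme and temperature are illustrative choices, not details from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def collaborative_kd_loss(student_logits, teacher_logits, dropped_logits, T=4.0):
    """KL divergence from the student's softened distribution to the average
    of the two teachers' softened distributions (one possible collaboration
    scheme; the paper's actual formulation may differ). Scaled by T^2 as in
    standard knowledge distillation."""
    p = 0.5 * (softmax(teacher_logits, T) + softmax(dropped_logits, T))
    log_q = np.log(softmax(student_logits, T) + 1e-12)
    kl = np.sum(p * (np.log(p + 1e-12) - log_q), axis=-1)
    return float((T * T) * np.mean(kl))

logits = np.array([[2.0, 0.5, -1.0]])
# Loss vanishes when the student already matches both teachers.
print(collaborative_kd_loss(logits, logits, logits))
```

In training, this distillation term would typically be combined with the usual cross-entropy loss on ground-truth labels, weighted by a mixing coefficient.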
