Abstract: Convolutional Neural Networks (CNNs) have achieved remarkable performance in remote sensing image classification. To address their high model complexity, we propose a block-level pruning strategy based on semantic similarity analysis that requires no fine-tuning during the pruning process, effectively reducing the complexity of the model. Furthermore, to restore the overall performance of the pruned model, we propose a teacher-student collaborative distillation strategy in which the original model and the dropped-blocks model jointly transfer knowledge to improve the performance of the compact pruned model. Experimental results demonstrate that our pruning and distillation strategies outperform other approaches, achieving favorable performance while reducing model complexity.
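The abstract does not specify the similarity criterion or the distillation loss, so the following PyTorch sketch is only one plausible reading, not the authors' method. It assumes "blocks" are shape-preserving residual blocks within one stage, that "semantic similarity" is cosine similarity between a block's input and output features (blocks that barely transform their input are treated as redundant and dropped without fine-tuning), and that collaborative distillation combines KL terms from the two teachers. All function names, the `keep_ratio`, `T`, `alpha`, and `beta` parameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def block_similarity(block: nn.Module, x: torch.Tensor) -> float:
    """Cosine similarity between a block's input and output features.
    A score near 1 suggests the block barely changes its input and may be
    redundant (an assumed criterion, not the paper's exact one)."""
    with torch.no_grad():
        y = block(x)
    return F.cosine_similarity(x.flatten(1), y.flatten(1), dim=1).mean().item()


def prune_blocks(blocks: nn.ModuleList, x: torch.Tensor,
                 keep_ratio: float = 0.7) -> nn.ModuleList:
    """Drop the most input-similar blocks; no fine-tuning is involved.
    Assumes every block preserves the feature shape (same-stage blocks)."""
    scores = []
    for blk in blocks:
        scores.append(block_similarity(blk, x))
        with torch.no_grad():
            x = blk(x)  # propagate so each block scores on realistic inputs
    n_keep = max(1, int(round(keep_ratio * len(blocks))))
    # Keep the blocks with the LOWEST similarity: they transform features most.
    keep_idx = sorted(sorted(range(len(blocks)), key=lambda i: scores[i])[:n_keep])
    return nn.ModuleList(blocks[i] for i in keep_idx)


def collaborative_kd_loss(student_logits, teacher_logits, dropped_logits,
                          labels, T: float = 4.0,
                          alpha: float = 0.5, beta: float = 0.3):
    """Hypothetical collaborative loss: cross-entropy on labels plus KL terms
    toward the original model and the dropped-blocks model."""
    ce = F.cross_entropy(student_logits, labels)
    kd_teacher = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                          F.softmax(teacher_logits / T, dim=1),
                          reduction="batchmean") * T * T
    kd_dropped = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                          F.softmax(dropped_logits / T, dim=1),
                          reduction="batchmean") * T * T
    return (1 - alpha - beta) * ce + alpha * kd_teacher + beta * kd_dropped
```

Under these assumptions, pruning is a single forward pass over a calibration batch, and the distillation stage trains only the pruned student while both teachers run in inference mode.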