Towards Elastic Image Super-Resolution Network via Progressive Self-distillation

Published: 01 Jan 2024, Last Modified: 11 Apr 2025, PRCV (8) 2024, CC BY-SA 4.0
Abstract: Recently, there has been a growing demand to deploy super-resolution (SR) networks on resource-constrained devices. However, most existing SR networks must keep the same architecture during both training and testing, which limits their adaptability in real-world scenarios. Achieving elastic reconstruction without retraining is therefore a crucial challenge. To this end, we propose a novel model compression and acceleration framework based on a Channel Splitting and Progressive Self-distillation (CSPS) strategy. Specifically, we construct a compact student network from the target teacher network via channel splitting, which removes a certain proportion of the channel dimensions from the teacher network. We then attach auxiliary upsampling layers to the intermediate feature maps and propose a progressive self-distillation scheme. Once trained, CSPS achieves elastic reconstruction by adjusting the channel splitting ratio and the number of feature extraction blocks, as sketched below. Extensive experiments demonstrate that the proposed CSPS effectively compresses and accelerates various off-the-shelf SR models.
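The abstract only outlines the mechanism, so the following is a minimal PyTorch sketch of how channel splitting plus progressive self-distillation could fit together, not the authors' implementation. All names here (ElasticSRNet, SlimBlock, ratio, num_blocks, the auxiliary heads) are illustrative assumptions: the "student" is obtained by slicing the first ratio * C channels of each convolution, and auxiliary upsampling heads on every intermediate feature map allow a distillation loss against the full-width output of the same network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SlimBlock(nn.Module):
    """3x3 conv block whose channels can be sliced at run time (channel splitting)."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x, ratio=1.0):
        c = max(1, int(self.conv.out_channels * ratio))
        w = self.conv.weight[:c, : x.shape[1]]  # slice output and input channels
        return F.relu(F.conv2d(x, w, self.conv.bias[:c], padding=1))


class ElasticSRNet(nn.Module):
    """Hypothetical elastic SR backbone; names and structure are assumptions."""
    def __init__(self, channels=64, num_blocks=8, scale=4):
        super().__init__()
        self.channels, self.scale = channels, scale
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(SlimBlock(channels) for _ in range(num_blocks))
        # One auxiliary upsampling head per intermediate feature map.
        self.aux = nn.ModuleList(
            nn.Conv2d(channels, 3 * scale**2, 3, padding=1) for _ in range(num_blocks)
        )

    def forward(self, x, ratio=1.0, num_blocks=None):
        n = num_blocks or len(self.blocks)
        feat = self.head(x)[:, : max(1, int(self.channels * ratio))]
        outs = []
        for i in range(n):
            feat = self.blocks[i](feat, ratio)
            w = self.aux[i].weight[:, : feat.shape[1]]  # slice aux head input channels
            sr = F.conv2d(feat, w, self.aux[i].bias, padding=1)
            outs.append(F.pixel_shuffle(sr, self.scale))
        return outs  # one SR prediction per processed block


# Progressive self-distillation step (sketch): the full-width path of the same
# network supervises the channel-split sub-network at every auxiliary output.
net = ElasticSRNet()
lr, hr = torch.rand(2, 3, 32, 32), torch.rand(2, 3, 128, 128)
with torch.no_grad():
    teacher_sr = net(lr, ratio=1.0)[-1]          # final full-capacity output
student_outs = net(lr, ratio=0.5, num_blocks=6)  # compact student sub-network
loss = F.l1_loss(student_outs[-1], hr)           # reconstruction loss
loss += sum(F.l1_loss(o, teacher_sr) for o in student_outs)  # distillation loss
loss.backward()
```

Under this reading, elastic reconstruction at test time amounts to picking a (ratio, num_blocks) pair and running the corresponding sub-network, with no retraining needed.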