Weight-Sharing Method for Upsampling Layer from Feature Embedding Recursive Block

Published: 09 Oct 2024, Last Modified: 19 Nov 2024
Venue: Compression Workshop @ NeurIPS 2024
License: CC BY 4.0
Keywords: Weight-Sharing, Upsampling, Super-Resolution, Laplacian-Pyramid, Machine-Learning
Abstract: In super-resolution, models built on the Laplacian pyramid framework must approximate the inverse of a convolution in their upscaling layers. A transposed convolution is commonly used to approximate this inverse, and it can be designed efficiently to reduce the number of trainable weights. In this study, we propose a new model compression method that replaces the transposed convolution layer by sharing the weights of a convolution layer trained in the feature embedding recursive block. The proposed weight-sharing method effectively reduces training complexity and training time, and our experiments confirm these gains even for relatively large image sizes.
Submission Number: 6
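
The abstract's core idea, reusing the weights of a convolution from the feature embedding recursive block in place of a separately trained transposed convolution, can be illustrated with a minimal PyTorch sketch. The module name, channel sizes, and scale factor below are illustrative assumptions, not details taken from the paper; applying `conv_transpose2d` with a shared kernel is one standard way to approximate the inverse of that convolution.

```python
# Minimal sketch of the weight-sharing idea, assuming a PyTorch
# implementation. Names and sizes are hypothetical, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedUpsample(nn.Module):
    """Upsamples by reusing the weights of a convolution layer from the
    feature embedding recursive block, instead of training a separate
    transposed-convolution layer (so no new trainable weights are added)."""

    def __init__(self, shared_conv: nn.Conv2d, scale: int = 2):
        super().__init__()
        self.shared_conv = shared_conv  # shared reference, not a copy
        self.scale = scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # conv_transpose2d with the shared kernel approximates the
        # inverse of the shared convolution; stride sets the upscaling.
        return F.conv_transpose2d(
            x,
            self.shared_conv.weight,          # weights shared with the block
            stride=self.scale,
            padding=self.shared_conv.padding[0],
            output_padding=self.scale - 1,    # keeps output size = scale * input
        )

# Illustrative usage: one convolution from the recursive block
# also drives the upsampling layer.
embed_conv = nn.Conv2d(64, 64, kernel_size=3, padding=1)
upsample = SharedUpsample(embed_conv, scale=2)
x = torch.randn(1, 64, 32, 32)
print(upsample(x).shape)  # torch.Size([1, 64, 64, 64])
```

Because the upsampling layer holds only a reference to `embed_conv.weight`, any gradient updates to the recursive block's convolution are reflected in the upsampler automatically, which is how this sketch realizes the reduction in trainable weights that the abstract describes.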