Keywords: image super-resolution, sparsity, lottery ticket
TL;DR: We explore parameter efficiency in image super-resolution networks through sparsification, and revisit the Lottery Ticket Hypothesis from a new perspective.
Abstract: The over-parameterization of neural networks has long received wide attention. It offers the opportunity to find, within an over-parameterized network, sub-networks that improve parameter efficiency. In this study, we use EDSR as the backbone network to explore parameter efficiency in super-resolution (SR) networks in the form of sparsity. Specifically, we search for sparse sub-networks at two granularities, individual weights and convolution kernels, using various methods, and analyze the relationship between the structure and performance of the sub-networks. (1) At weight granularity, we observe the ``Lottery Ticket Hypothesis'' from a new perspective in the regression task of SR. (2) At convolution-kernel granularity, we apply several methods to explore how different sparse sub-networks affect network performance, and find that, under certain rules, the performance of different sub-networks rarely depends on their structure. (3) We propose a very convenient width-sparsity method at convolution-kernel granularity, which can improve the parameter efficiency of most SR networks.
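The abstract does not include implementation details, so the following PyTorch sketch is only a rough illustration of what the two granularities mean: masking a convolution layer by weight magnitude either per individual weight or per whole 2D kernel. The function names (`weight_mask`, `kernel_mask`) and the magnitude criterion are our own assumptions, not the paper's published method.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed, not from the paper): magnitude-based pruning
# of a conv layer at weight granularity vs. convolution-kernel granularity.

def weight_mask(conv: nn.Conv2d, sparsity: float) -> torch.Tensor:
    """Mask individual weights: zero out the smallest-magnitude fraction."""
    w = conv.weight.detach().abs().flatten()
    k = int(sparsity * w.numel())
    threshold = w.kthvalue(k).values if k > 0 else w.new_tensor(-1.0)
    return (conv.weight.detach().abs() > threshold).float()

def kernel_mask(conv: nn.Conv2d, sparsity: float) -> torch.Tensor:
    """Mask whole kernels: rank each (out, in) 2D kernel by its L1 norm."""
    norms = conv.weight.detach().abs().sum(dim=(2, 3))  # shape (out_ch, in_ch)
    k = int(sparsity * norms.numel())
    threshold = norms.flatten().kthvalue(k).values if k > 0 else norms.new_tensor(-1.0)
    keep = (norms > threshold).float()                   # shape (out_ch, in_ch)
    return keep[:, :, None, None].expand_as(conv.weight)

# Usage: apply a 50%-sparse kernel-granularity mask to a 3x3 conv,
# as would appear inside an EDSR residual block.
conv = nn.Conv2d(64, 64, kernel_size=3, padding=1)
mask = kernel_mask(conv, sparsity=0.5)
conv.weight.data.mul_(mask)  # zero out the pruned sub-network
print(f"remaining weights: {int(mask.sum())} / {mask.numel()}")
```

Kernel-granularity masks remove entire 3x3 kernels at once, which is coarser but maps more directly onto width-style structured sparsity than per-weight masks do.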