Gated normalization unit for image restoration

Published: 01 Jan 2025, Last Modified: 18 Jul 2025 · Pattern Anal. Appl. 2025 · CC BY-SA 4.0
Abstract: Image restoration has been an integral part of image processing research, with the goal of converting degraded images into clear ones. While some networks have achieved state-of-the-art results through architecture and module design, little attention has been paid to adapting normalization methods to image restoration tasks. Normalization methods are crucial in deep learning. In this work, we combine gating mechanisms with normalization methods. Gating mechanisms are widely used for feature extraction and information filtering, and pairing them with normalization offers potential for designing image restoration algorithms. First, we propose a Simple Gated Attention Unit (SGAU), a block built on a simple gating mechanism, to validate this potential. We then propose a new normalization block, Gated Instance Normalization (GIN), and introduce Global Response Normalization (GRN) to image restoration tasks. Both GIN and GRN combine gating mechanisms with normalization for feature extraction, fusion, and integration. Finally, we propose a two-stage network, the Gated Normalization Network (GNNet), which uses GIN and GRN as building blocks to effectively extract and filter information. Depthwise separable convolutions are used in the deep layers to reduce parameters while preserving spatial information, improving local feature perception. An improved cross-stage feature fusion (ICSFF) block transfers feature information between the two stages, and a supervised attention module (SAM) refines the first-stage output before passing it to the second stage. Across various image restoration tasks, we achieve 32.93 dB PSNR on GoPro and 30.42 dB PSNR on HIDE for image deblurring, 39.94 dB PSNR on SIDD for real-world denoising, and good performance on Gaussian white noise denoising and image deraining. Moreover, GIN and GRN introduce only a small number of gating weight and bias parameters; compared with other multi-stage networks, the model size is reduced and the computational complexity is well balanced.
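The abstract does not give the exact formulations of GIN or GRN. As a rough illustration only, the PyTorch sketch below pairs instance normalization with a learned sigmoid gate (the GatedInstanceNorm class, its 1x1-convolution gate, and its parameter choices are assumptions, not the paper's design) and implements Global Response Normalization following its published ConvNeXt V2 definition, which the abstract says is introduced here for restoration.

```python
import torch
import torch.nn as nn

class GatedInstanceNorm(nn.Module):
    """Hypothetical sketch of a Gated Instance Normalization (GIN) block:
    instance-normalize the features, then filter them with a per-pixel
    gate computed from the un-normalized input. The paper may derive its
    gate differently (e.g., via a channel split)."""
    def __init__(self, channels: int):
        super().__init__()
        self.norm = nn.InstanceNorm2d(channels, affine=True)
        self.gate = nn.Conv2d(channels, channels, kernel_size=1)  # assumed gate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalized features modulated by a sigmoid gate in [0, 1].
        return self.norm(x) * torch.sigmoid(self.gate(x))

class GRN(nn.Module):
    """Global Response Normalization as defined in ConvNeXt V2,
    written for channels-first (N, C, H, W) tensors."""
    def __init__(self, channels: int):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gx = torch.norm(x, p=2, dim=(2, 3), keepdim=True)    # per-channel global response
        nx = gx / (gx.mean(dim=1, keepdim=True) + 1e-6)      # divisive normalization across channels
        return self.gamma * (x * nx) + self.beta + x         # learnable affine + residual

# Minimal usage check on a dummy feature map.
if __name__ == "__main__":
    feats = torch.randn(2, 32, 64, 64)
    out = GRN(32)(GatedInstanceNorm(32)(feats))
    print(out.shape)  # torch.Size([2, 32, 64, 64])
```

Both modules add only per-channel gating weights and biases on top of the normalization itself, which is consistent with the abstract's claim that GIN and GRN contribute few extra parameters.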