A Lightweight Multi-Scale Based Attention Network for Image Super-Resolution

Published: 01 Jan 2023, Last Modified: 12 Apr 2025 · IECON 2023 · License: CC BY-SA 4.0
Abstract: In this paper, we propose a lightweight multi-scale based attention network (MBAN) for single-image super-resolution (SISR). First, a deep feature transform block (DFTB) is designed for multi-scale feature extraction; this block combines group convolution with improved channel attention (ICA) to boost performance while remaining lightweight. Second, a dual multi-scale attention block (DMAB) is proposed for long-range information interaction; this block applies self-attention (SA) with different window sizes and uses short connections between branches to achieve multi-scale attention interaction. Finally, our MBAN is constructed by cascading multi-scale based attention blocks (MBABs) that perform detail restoration; each block simultaneously extracts multi-scale local features and integrates multi-scale global features via its DFTBs and DMABs. Extensive experiments show that MBAN outperforms state-of-the-art (SOTA) lightweight SR methods in both quantitative metrics and visual quality.
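
The abstract does not give implementation details, so as a rough illustration of the DFTB idea (grouped convolutions for cheap multi-scale local feature extraction, followed by channel attention), here is a minimal PyTorch sketch. All names (`DFTBSketch`, `ChannelAttention`), kernel sizes, group counts, and the residual layout are assumptions for illustration, not the authors' actual DFTB/ICA design.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Plain SE-style channel attention; the paper's ICA variant is not
    specified in the abstract, so this is a stand-in."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reweight each channel by a learned, globally pooled gate.
        return x * self.fc(self.pool(x))


class DFTBSketch(nn.Module):
    """Hypothetical deep feature transform block: parallel grouped convs
    with different receptive fields, fused and gated by channel attention."""

    def __init__(self, channels: int = 64, groups: int = 4):
        super().__init__()
        # Group convolution keeps the parameter count roughly 1/groups of a
        # dense conv, which is where the "lightweight" property comes from.
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1, groups=groups)
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2, groups=groups)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.act = nn.GELU()
        self.ca = ChannelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate the two scales, fuse with a 1x1 conv, then gate channels.
        y = torch.cat([self.branch3(x), self.branch5(x)], dim=1)
        y = self.ca(self.act(self.fuse(y)))
        return x + y  # residual connection, common in SR feature blocks


if __name__ == "__main__":
    x = torch.randn(1, 64, 48, 48)
    print(DFTBSketch()(x).shape)  # torch.Size([1, 64, 48, 48])
```

The DMAB would follow the same spirit but replace the convolutional branches with window self-attention at two or more window sizes, with short connections exchanging features between the branches; a faithful version would need the paper's architecture details.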
