Modumer: Modulating Transformer for Image Restoration

15 Sept 2024 (modified: 12 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Image restoration, Transformer block, Modulation design
Abstract: Image restoration aims to recover clean images from degraded versions. While Transformer-based approaches have achieved significant advances in this field, they are limited by high complexity and an inability to capture omni-range dependencies, which hinders their overall performance. In this work, we develop Modumer for effective and efficient image restoration by revisiting the Transformer block and the modulation design, which processes inputs through a convolutional block and projection layers and fuses features via element-wise multiplication. Specifically, within each unit of Modumer, we integrate the cascaded modulation design with the downsampled Transformer block to build the attention layers, enabling omni-kernel modulation and mapping inputs into high-dimensional feature spaces. Moreover, we introduce a bio-inspired parameter-sharing mechanism into the attention layers, which not only improves efficiency but also boosts performance. Additionally, a dual-domain feed-forward network strengthens the representational power of the model. Extensive experiments demonstrate that the proposed Modumer achieves state-of-the-art performance on ten datasets across five image restoration tasks: image motion deblurring, image deraining, image dehazing, image desnowing, and low-light image enhancement. Furthermore, our model yields promising performance on all-in-one image restoration tasks.
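For illustration, below is a minimal PyTorch sketch of the modulation design as described in the abstract: a convolutional context branch and a linear projection branch fused by element-wise multiplication. All module names, kernel sizes, and layer choices here are assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ModulationBlock(nn.Module):
    """Sketch of a modulation design: a convolutional context branch
    gates a pointwise projection branch via element-wise multiplication.
    Layer choices (1x1 projections, 3x3 depthwise conv) are assumptions."""

    def __init__(self, dim: int):
        super().__init__()
        # Context branch: pointwise projection followed by a depthwise conv
        # to aggregate local spatial context.
        self.context = nn.Sequential(
            nn.Conv2d(dim, dim, kernel_size=1),
            nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=dim),
        )
        # Value branch: pointwise projection only.
        self.value = nn.Conv2d(dim, dim, kernel_size=1)
        # Output projection applied after fusion.
        self.proj = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fuse the two branches by element-wise multiplication.
        return self.proj(self.context(x) * self.value(x))

# Usage: modulate a 64-channel feature map.
x = torch.randn(1, 64, 128, 128)
out = ModulationBlock(64)(x)
print(out.shape)  # torch.Size([1, 64, 128, 128])
```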
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 943