Enhancing Image Restoration Transformer with Adaptive Token Dictionary

15 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: image restoration, transformer, dictionary learning
Abstract: Image restoration is a classic computer vision problem that involves estimating high-quality (HQ) images from low-quality (LQ) ones. To compensate for the information loss incurred during degradation, prior knowledge of HQ images is indispensable. While deep neural networks (DNNs), and especially Transformers for image restoration, have seen significant advancements in recent years, challenges remain, particularly in explicitly incorporating external priors, managing computational complexity, and tailoring generalized external priors to image specifics. To address these issues, we propose to enhance the Transformer with an Adaptive Token Dictionary (ATD), a novel architecture that introduces a token dictionary to explicitly model external priors in the attention mechanism. The proposed ATD computes attention between the input features and the token dictionary, which integrates similar features on a global scale. Furthermore, we propose an adaptive dictionary refinement (ADR) mechanism to progressively customize the shared tokens to image specifics from shallow to deep layers. Crucially, because the token dictionary is condensed, the computational complexity of the new attention mechanism is reduced from quadratic to linear in the number of image tokens. This efficiency makes our network notably advantageous in constrained settings. Experimental results show that our method achieves the best performance on various image restoration benchmarks.
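To make the linear-complexity claim concrete, the following PyTorch sketch shows cross-attention between image tokens and a small set of learnable dictionary tokens. This is a minimal illustration, not the authors' implementation: all names and shapes (TokenDictionaryAttention, dict_size, dim) are hypothetical, and the ADR refinement step is omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TokenDictionaryAttention(nn.Module):
    """Hypothetical sketch: attention against a learned token dictionary."""

    def __init__(self, dim: int, dict_size: int = 64):
        super().__init__()
        # Shared external prior: a small set of learnable dictionary tokens.
        self.dictionary = nn.Parameter(torch.randn(dict_size, dim) * 0.02)
        self.to_q = nn.Linear(dim, dim)
        self.to_kv = nn.Linear(dim, 2 * dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_tokens, dim), where num_tokens = H * W image tokens.
        b, n, d = x.shape
        q = self.to_q(x)                                      # (b, n, d)
        # Keys/values come from the dictionary, not from the image tokens.
        k, v = self.to_kv(self.dictionary).chunk(2, dim=-1)   # (m, d) each
        # The attention map is (n, m) with m = dict_size << n, so the cost
        # is O(n * m * d): linear in the number of image tokens, instead of
        # the O(n^2 * d) of full token-to-token self-attention.
        attn = F.softmax(q @ k.t() / d ** 0.5, dim=-1)        # (b, n, m)
        out = attn @ v                                        # (b, n, d)
        return self.proj(out)

layer = TokenDictionaryAttention(dim=96, dict_size=64)
tokens = torch.randn(2, 64 * 64, 96)   # e.g. a 64x64 grid of image tokens
print(layer(tokens).shape)             # torch.Size([2, 4096, 96])

In the paper's design, the dictionary tokens would additionally be refined per image from shallow to deep layers (the ADR mechanism); the sketch above keeps them fixed for simplicity.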
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 122