Keywords: fourier adapter, parameter-efficient transfer learning, image restoration, degradation-aware gating, spectral modulation
TL;DR: FraIR introduces a Fourier-domain, degradation-aware adapter for efficient transfer learning in image restoration, achieving state-of-the-art performance with minimal parameter overhead and zero inference cost.
Abstract: Restoring high-quality images from degraded inputs is a core challenge in computer vision, especially under diverse or compound distortions. While large-scale all-in-one models offer strong performance, they are computationally expensive and generalize poorly to unseen degradations. Parameter-Efficient Transfer Learning (PETL) provides a scalable alternative, but most methods operate in the spatial domain and struggle to adapt to frequency-sensitive artifacts such as blur, noise, or compression. We propose \textbf{FraIR}, a Fourier-based Recomposition Adapter for image restoration that enables efficient and expressive adaptation in the spectral domain. FraIR applies a 1D Fourier Transform to decompose token features into frequency components, performs low-rank adaptation via spectral projections with learnable reweighting, and reconstructs the adapted signal using an inverse transform gated by task-specific modulation. Integrated as plug-and-play modules within Transformer layers, FraIR is reparameterizable for zero-latency inference and requires less than 0.5% additional parameters. Extensive experiments across denoising, deraining, super-resolution, and hybrid-degradation benchmarks show that FraIR outperforms prior PETL methods and matches or exceeds fully fine-tuned baselines, demonstrating strong generalization with minimal cost. Unlike prior Fourier-based approaches that focus on generative modeling or static modulation, FraIR offers dynamic, degradation-aware recomposition in frequency space for efficient restoration.
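The adaptation pipeline described in the abstract (1D Fourier decomposition of token features, low-rank spectral projection with learnable per-frequency reweighting, and a gated inverse transform applied as a residual update) can be sketched as follows. This is a minimal conceptual illustration, not the authors' implementation: the function name, tensor shapes, and parameter names (`W_down`, `W_up`, `reweight`, `gate`) are assumptions for exposition.

```python
import numpy as np

def frair_adapter(x, W_down, W_up, reweight, gate):
    """Hypothetical sketch of a FraIR-style spectral adapter.

    x:        (tokens, dim) real-valued token features
    W_down:   (dim, rank)   low-rank down-projection (assumed shape)
    W_up:     (rank, dim)   low-rank up-projection (assumed shape)
    reweight: (freqs,)      learnable per-frequency weights
    gate:     (dim,)        task-specific modulation applied after the inverse FFT
    """
    # 1) Decompose token features into frequency components along the token axis.
    X = np.fft.rfft(x, axis=0)                  # (freqs, dim), complex-valued
    # 2) Low-rank adaptation via spectral projections, then learnable reweighting.
    Z = (X @ W_down) @ W_up                     # low-rank update in frequency space
    Z = Z * reweight[:, None]                   # per-frequency reweighting
    # 3) Reconstruct the adapted signal with the inverse transform, gated.
    y = np.fft.irfft(Z, n=x.shape[0], axis=0)   # back to the token domain, real-valued
    return x + gate * y                         # residual adapter update
```

Because the adapter is a residual branch, a zero gate recovers the frozen backbone's features exactly, which is consistent with the reparameterizable, zero-latency-inference property claimed in the abstract.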
Supplementary Material: pdf
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 25021