BiMaCoSR: Binary One-Step Diffusion Model Leveraging Flexible Matrix Compression for Real Super-Resolution

Published: 01 May 2025, Last Modified: 18 Jun 2025 | ICML 2025 poster | CC BY 4.0
TL;DR: A binarized one-step diffusion model for super-resolution.
Abstract: While super-resolution (SR) methods based on diffusion models (DM) have demonstrated inspiring performance, their deployment is impeded by heavy demands on memory and computation. Recent work applies two kinds of methods to compress or accelerate DMs. One compresses the DM to 1-bit, i.e., binarization, alleviating the storage and computation pressure. The other distills the multi-step DM into a single step, significantly speeding up the inference process. Nonetheless, it remains impractical to deploy DMs on resource-limited edge devices. To address this problem, we propose BiMaCoSR, which combines binarization and one-step distillation to obtain extreme compression and acceleration. To prevent the catastrophic collapse of the model caused by binarization, we propose a sparse matrix branch (SMB) and a low rank matrix branch (LRMB). Both auxiliary branches pass full-precision (FP) information, but in different ways: SMB absorbs the extreme values and its output is high rank, carrying abundant FP information, whereas LRMB, inspired by LoRA, is initialized with the top-r SVD components and outputs a low-rank representation. The computation and storage overhead of the proposed branches is negligible. Comprehensive comparison experiments show that BiMaCoSR outperforms current state-of-the-art binarization methods and achieves competitive performance compared with the FP one-step model. Moreover, BiMaCoSR delivers excellent compression and acceleration: a 23.8x compression ratio and a 27.4x speedup over its FP counterpart. Our code and model are available at https://github.com/Kai-Liu001/BiMaCoSR.
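The following is a minimal, hypothetical PyTorch sketch of the idea described above: a binarized linear layer augmented with a sparse branch that keeps the largest-magnitude weights in full precision and a LoRA-style low-rank branch initialized from the top-r SVD components. All names (`BinaryLinearWithBranches`, `rank`, `sparsity`) and implementation details are illustrative assumptions, not the authors' code; see the linked repository for the actual BiMaCoSR implementation.

```python
# Hypothetical sketch of a binarized linear layer with a sparse matrix branch
# (extreme values kept in full precision) and a low-rank branch initialized
# from the top-r SVD components of the full-precision weight.
import torch
import torch.nn as nn


class BinaryLinearWithBranches(nn.Module):
    def __init__(self, in_features, out_features, rank=8, sparsity=0.01):
        super().__init__()
        # Latent full-precision weight (binarized on the fly in forward()).
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)

        # Low-rank branch: initialized from the top-r SVD components.
        U, S, Vh = torch.linalg.svd(self.weight.detach(), full_matrices=False)
        self.lora_A = nn.Parameter(Vh[:rank, :] * S[:rank].sqrt().unsqueeze(1))
        self.lora_B = nn.Parameter(U[:, :rank] * S[:rank].sqrt().unsqueeze(0))

        # Sparse branch: mask of the largest-magnitude (extreme) weights.
        k = max(1, int(sparsity * self.weight.numel()))
        thresh = self.weight.detach().abs().flatten().topk(k).values.min()
        self.register_buffer(
            "sparse_mask", (self.weight.detach().abs() >= thresh).float()
        )

    def forward(self, x):
        w = self.weight
        # Extreme values are routed to the sparse branch; the remainder is
        # binarized to {-alpha, +alpha} with a straight-through estimator.
        w_dense = w * (1.0 - self.sparse_mask)
        alpha = w_dense.abs().mean()
        w_bin = alpha * torch.sign(w_dense)
        w_bin = w_dense + (w_bin - w_dense).detach()  # STE for gradients

        out = nn.functional.linear(x, w_bin)                       # binary path
        out = out + nn.functional.linear(x, w * self.sparse_mask)  # sparse branch
        out = out + (x @ self.lora_A.t()) @ self.lora_B.t()        # low-rank branch
        return out


if __name__ == "__main__":
    layer = BinaryLinearWithBranches(64, 128, rank=4)
    y = layer(torch.randn(2, 64))
    print(y.shape)  # torch.Size([2, 128])
```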
Lay Summary: Super-resolution models improve image quality but are often too large and slow for devices with limited resources. To solve this, we developed BiMaCoSR, a method that combines two strategies: compressing the model and speeding up its inference. We use two new techniques to preserve important details without adding extra cost. BiMaCoSR is 23.8 times smaller and 27.4 times faster than its full-precision counterpart, making it much more efficient. Our code is available at [https://github.com/Kai-Liu001/BiMaCoSR](https://github.com/Kai-Liu001/BiMaCoSR).
Link To Code: https://github.com/Kai-Liu001/BiMaCoSR
Primary Area: Applications->Computer Vision
Keywords: super-resolution, one-step diffusion, binarization, quantization
Submission Number: 183