Residual adaptive sparse hybrid attention transformer for image super resolution

Published: 01 Jan 2024, Last Modified: 15 Nov 2024 · Eng. Appl. Artif. Intell. 2024 · CC BY-SA 4.0
Abstract: Highlights
• Designed an image super-resolution network based on hybrid attention.
• Implemented adaptive sparse attention with BRA to capture long-distance dependencies.
• Used ASHAB to simultaneously capture global, local, and long-range dependencies.
• Proposed a hybrid loss function to obtain frequency-domain supervision.
• Our method achieves performance comparable to SOTA methods with fewer parameters.
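The frequency-domain supervision mentioned in the highlights can be illustrated with a minimal sketch: a pixel-space L1 loss combined with an L1 loss on FFT magnitudes. The weighting `alpha` and the use of amplitude-only spectra are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def hybrid_loss(sr: torch.Tensor, hr: torch.Tensor, alpha: float = 0.05) -> torch.Tensor:
    """Hybrid loss sketch: pixel-space L1 plus a frequency-domain term.

    `alpha` (the frequency-term weight) is a hypothetical value,
    not the one used in the paper.
    """
    # Standard pixel-wise L1 between super-resolved and ground-truth images
    pixel = F.l1_loss(sr, hr)
    # 2-D FFT over the spatial dimensions; supervise the spectral magnitudes
    freq = F.l1_loss(torch.abs(torch.fft.fft2(sr)), torch.abs(torch.fft.fft2(hr)))
    return pixel + alpha * freq
```

Comparing FFT magnitudes penalizes missing high-frequency detail (edges, textures) that a purely pixel-wise loss tends to blur.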