Event-Guided Rolling Shutter Correction with Time-Aware Cross-Attentions

Published: 20 Jul 2024 · Last Modified: 21 Jul 2024 · MM 2024 Poster · CC BY 4.0
Abstract: Many consumer cameras with rolling shutter (RS) CMOS sensors suffer from undesired distortion and artifacts, particularly when objects experience fast motion. The neuromorphic event camera, with its high-temporal-resolution events, can greatly benefit the RS correction process. In this work, we explore the characteristics of RS images and event data for the design of a rolling shutter correction (RSC) model. Specifically, the relationship between RS images and event data is modeled by incorporating time encoding into the computation of cross-attention in the transformer encoder, achieving time-aware multi-modal information fusion. Features from RS images enhanced by event data are adopted as keys and values in the transformer decoder, providing the source of appearance, while features from event data enhanced by RS images are adopted as queries, providing spatial transition information. By embedding the time information of the desired global shutter (GS) image into the query, the transformer with deformable attention is capable of producing the target GS image. To enhance the model's generalization ability, we further self-supervise the model by cycling between the time coordinate systems corresponding to RS images and GS images. Extensive evaluations on both synthetic and real datasets demonstrate that the proposed method performs favorably against state-of-the-art approaches.
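The core fusion idea in the abstract, adding time encodings to queries and keys before computing cross-attention between the event branch and the RS-image branch, can be illustrated with a minimal numpy sketch. The sinusoidal encoding and all function names here are assumptions for illustration; the paper's exact time-encoding form and attention implementation are not specified in the abstract.

```python
import numpy as np

def sinusoidal_time_encoding(t, dim):
    # Standard sinusoidal encoding of a scalar timestamp (an assumed
    # encoding; the abstract does not state the exact form used).
    freqs = 1.0 / (10000.0 ** (np.arange(0, dim, 2) / dim))
    enc = np.zeros(dim)
    enc[0::2] = np.sin(t * freqs)
    enc[1::2] = np.cos(t * freqs)
    return enc

def time_aware_cross_attention(q_feats, kv_feats, q_times, kv_times):
    # q_feats:  (Nq, D) event-branch features (queries).
    # kv_feats: (Nk, D) RS-image-branch features (keys/values).
    # Timestamps are injected into queries and keys before the
    # attention-score computation, making the fusion time-aware.
    D = q_feats.shape[1]
    q = q_feats + np.stack([sinusoidal_time_encoding(t, D) for t in q_times])
    k = kv_feats + np.stack([sinusoidal_time_encoding(t, D) for t in kv_times])
    scores = q @ k.T / np.sqrt(D)
    # Numerically stable softmax over keys.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ kv_feats
```

Conditioning the query on the desired GS timestamp (as the decoder does) would amount to adding the encoding of that target time to the event-branch queries before attention.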
Primary Subject Area: [Content] Multimodal Fusion
Relevance To Conference: Event cameras are bio-inspired sensors with high dynamic range and high temporal resolution, complementary to frame-based cameras. In this work, we design a fusion algorithm for RGB and event multimodal data to address the image distortion caused by rolling shutter effects.
Supplementary Material: zip
Submission Number: 1354