GeMS: Efficient Gaussian Splatting for Extreme Motion Blur

TMLR Paper 4917 Authors

22 May 2025 (modified: 27 May 2025) · Under review for TMLR · CC BY 4.0
Abstract: We introduce GeMS, a framework for 3D Gaussian Splatting designed to handle severely motion-blurred images. State-of-the-art deblurring methods for extreme motion blur, such as ExBluRF, as well as Gaussian Splatting-based approaches like Deblur-GS, typically assume access to corresponding sharp images for camera pose estimation and point cloud generation, which is an unrealistic assumption. Additionally, methods relying on COLMAP initialization, such as BAD-Gaussians, fail due to the lack of reliable feature correspondences under severe motion blur. To address these challenges, we propose GeMS, a 3D Gaussian Splatting framework that reconstructs scenes directly from extremely motion-blurred images. GeMS integrates: (1) VGGSfM, a deep-learning-based SfM pipeline that estimates camera poses and generates point clouds directly from severely motion-blurred images; (2) MCMC-based Gaussian Splatting, which enables robust scene initialization by treating Gaussians as samples from an underlying probability distribution, eliminating heuristic densification and pruning strategies; and (3) joint optimization of the camera motion trajectory and Gaussian parameters, which ensures stable and accurate reconstruction. While this pipeline produces reasonable reconstructions, extreme motion blur can still introduce inaccuracies, especially when all input views are severely blurred. To address this, we propose GeMS-E, which adds a progressive refinement step when event data are available. Specifically, we perform (4) Event-based Double Integral (EDI) deblurring, which first restores deblurred images from the motion-blurred inputs. These deblurred images are then fed into the GeMS framework, leading to improved pose estimation, point cloud generation, and hence overall reconstruction quality. Both GeMS and GeMS-E achieve state-of-the-art performance on synthetic and real-world datasets, demonstrating their effectiveness in handling extreme motion blur. To the best of our knowledge, we are the first to effectively address this problem in extreme-blur scenarios within a 3D Gaussian Splatting framework without requiring sharp images for SfM (pose and point cloud) initialization.
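The EDI step used in GeMS-E admits a compact illustration. Below is a minimal sketch of event-based double-integral deblurring under the standard EDI model, assuming a grayscale blurry frame in linear intensity, per-pixel cumulative event integrals sampled at discrete times within the exposure, and a known contrast threshold; the function and variable names (`edi_deblur`, `event_integrals`, etc.) are illustrative and not taken from the authors' implementation.

```python
import numpy as np

def edi_deblur(blurry, event_integrals, c, exposure_time):
    """Recover a latent sharp frame at the reference time via the EDI model.

    blurry          : (H, W) array, time-averaged intensity over the exposure.
    event_integrals : (n_steps, H, W) array of cumulative event counts E(t),
                      measured from the reference time to each sample time t.
    c               : event-camera contrast threshold.
    exposure_time   : exposure duration T (same time unit as the sampling).
    """
    n_steps = event_integrals.shape[0]
    dt = exposure_time / n_steps

    # EDI model: L(t) = L(t_ref) * exp(c * E(t)), and the blurry frame is the
    # average of L(t) over the exposure, so
    #   blurry = L(t_ref) * (1/T) * integral_T exp(c * E(t)) dt.
    denom = np.sum(np.exp(c * event_integrals), axis=0) * dt / exposure_time

    # Divide out the accumulated exponential event term to obtain the sharp frame.
    latent = blurry / np.maximum(denom, 1e-8)
    return latent
```

In GeMS-E these deblurred frames are then passed to the same SfM and Gaussian Splatting stages as the raw inputs in GeMS, so the refinement step changes only what the downstream pipeline sees, not the pipeline itself.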
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Ming-Hsuan_Yang1
Submission Number: 4917