RemedyGS: Defend 3D Gaussian Splatting Against Computation Cost Attack

07 Sept 2025 (modified: 13 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: AI safety, 3DGS
TL;DR: We propose the first effective black-box defense framework against computation cost attacks on 3DGS training.
Abstract: As a mainstream technique for 3D reconstruction, 3D Gaussian Splatting (3DGS) has been applied in a wide range of applications and services. Recent studies have revealed critical vulnerabilities in this pipeline and introduced computation cost attacks that cause malicious resource occupation and even denial-of-service (DoS) conditions, thereby hindering the reliable deployment of 3DGS. In this paper, we propose the first effective and comprehensive black-box defense framework, named RemedyGS, against such computation cost attacks, safeguarding 3DGS reconstruction systems and services. Our pipeline comprises two key components: a detector that identifies attacked input images carrying poisoned textures, and a purifier that recovers benign images from their attacked counterparts, mitigating the adverse effects of these attacks. Moreover, we incorporate adversarial training into the purifier to enforce distributional alignment between the recovered and original natural images, further enhancing defense efficacy. Experimental results demonstrate that our framework effectively safeguards 3DGS systems, achieving state-of-the-art performance in both safety and utility.
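The abstract describes the defense only at a high level (a detector, a purifier, and an adversarial critic for distributional alignment). Purely as an illustration of how such a detect-then-purify pipeline could be wired up, here is a minimal PyTorch sketch; all class names (`Detector`, `Purifier`, `Discriminator`, `defend_batch`), architectures, and the detection threshold are hypothetical assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

class Detector(nn.Module):
    """Hypothetical binary classifier that flags images with poisoned textures."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):
        # Returns a logit: positive means "attacked", negative means "benign".
        return self.head(self.features(x).flatten(1))


class Purifier(nn.Module):
    """Hypothetical image-to-image network that recovers a benign image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x):
        # Residual correction keeps benign content largely intact.
        return torch.clamp(x + self.net(x), 0.0, 1.0)


class Discriminator(nn.Module):
    """Adversarial critic used to align purified images with natural images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)


def defend_batch(images, detector, purifier, threshold=0.5):
    """Purify only the images the detector flags as attacked (assumed policy)."""
    with torch.no_grad():
        attacked = torch.sigmoid(detector(images)).squeeze(1) > threshold
        cleaned = images.clone()
        if attacked.any():
            cleaned[attacked] = purifier(images[attacked])
    return cleaned
```

In this sketch, the purifier would be trained jointly against the `Discriminator` with a standard GAN-style loss so that purified outputs are pushed toward the distribution of natural images, and the cleaned batch returned by `defend_batch` is what would be handed to the downstream 3DGS training pipeline; the actual losses and architectures used by RemedyGS are not specified in the abstract.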
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 2764