Feed-Forward 3D Gaussian Splatting Compression with Long-Context Modeling

15 Sept 2025 (modified: 13 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Compression, 3D Gaussian Splatting
Abstract: 3D Gaussian Splatting (3DGS) has emerged as a revolutionary 3D representation technique, but its substantial data size hinders broader applications. Feed-forward 3DGS compression, which avoids the time-consuming per-scene optimization that per-scene-trained compressors require, offers a promising solution for practical deployment. However, existing feed-forward compression methods struggle to model long-range spatial dependencies, owing to the limited receptive field of the transform coding network and the limited context capacity exploited by the entropy model. To address these issues, we propose a novel feed-forward 3DGS compression method that effectively exploits long contexts. Specifically, we first formulate a large-scale context structure comprising thousands of Gaussians based on Morton serialization. We then design a fine-grained space-channel auto-regressive entropy model to fully exploit this expansive context. Furthermore, we develop an attention-based transform coding model that extracts informative latent priors by aggregating features from a wide range of neighboring Gaussians. The proposed method yields a $20\times$ compression ratio for 3DGS in a single feed-forward pass and achieves state-of-the-art performance among feed-forward codecs.
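The Morton serialization step mentioned in the abstract orders Gaussians along a Z-order (Morton) space-filling curve, so that positions close in 3D space tend to be close in the serialized sequence that the entropy model consumes. The following is a minimal sketch of one common way to compute 3D Morton codes by quantizing coordinates to a 10-bit grid and interleaving bits; the variable names (`centers`, `morton3d`) are illustrative and not from the paper.

```python
import numpy as np

def part1by2(x: np.ndarray) -> np.ndarray:
    # Spread the bits of each 10-bit integer so there are two zero bits
    # between consecutive bits (x -> ..x..x..x), via standard mask tricks.
    x = x.astype(np.uint64) & 0x3FF
    x = (x | (x << 16)) & 0xFF0000FF
    x = (x | (x << 8)) & 0x0300F00F
    x = (x | (x << 4)) & 0x030C30C3
    x = (x | (x << 2)) & 0x09249249
    return x

def morton3d(coords: np.ndarray, bits: int = 10) -> np.ndarray:
    """Quantize (N, 3) coordinates to a 2**bits grid and interleave bits."""
    lo, hi = coords.min(axis=0), coords.max(axis=0)
    grid = ((coords - lo) / np.maximum(hi - lo, 1e-9)
            * (2**bits - 1)).astype(np.uint64)
    return (part1by2(grid[:, 0])
            | (part1by2(grid[:, 1]) << 1)
            | (part1by2(grid[:, 2]) << 2))

# Example: order hypothetical Gaussian centers along the Morton curve so
# that neighbors in the serialized sequence are mostly neighbors in 3D.
centers = np.array([[0.0, 0.0, 0.0],
                    [1.0, 1.0, 1.0],
                    [0.5, 0.5, 0.5]])
order = np.argsort(morton3d(centers))  # -> [0, 2, 1]
```

Sorting by these codes gives the large-scale context structure its spatial locality; how the paper groups the resulting sequence into contexts of thousands of Gaussians is specific to their method and not reproduced here.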
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 5614