A Large-scale Dataset of Gaussian Splats and Their Self-Supervised Pretraining

Published: 23 Mar 2025, Last Modified: 24 Mar 2025 · 3DV 2025 Oral · CC BY 4.0
Keywords: 3D object dataset, self-supervised pretraining, representation learning, Gaussian splatting
TL;DR: We present the ShapeSplat dataset together with our Gaussian-MAE method, which together enable masked pretraining directly on 3DGS parameters, leading to superior performance on downstream tasks.
Abstract: 3D Gaussian Splatting (3DGS) has become the de facto method of 3D representation in many vision tasks. This calls for 3D understanding directly in this representation space. To facilitate research in this direction, we first build a large-scale dataset of 3DGS from the commonly used ShapeNet and ModelNet datasets. Our dataset, ShapeSplat, consists of 65K objects from 87 unique categories, whose labels are in accordance with the respective source datasets. Creating this dataset required compute equivalent to 2 GPU years on a TITAN XP GPU. We use our dataset for unsupervised pretraining and supervised finetuning on classification and segmentation tasks. To this end, we introduce Gaussian-MAE, which highlights the unique benefits of representation learning from Gaussian parameters. Through exhaustive experiments, we provide several valuable insights. In particular, we show that (1) the distribution of the optimized GS centroids differs significantly from that of the uniformly sampled point clouds used for initialization; (2) this change in distribution degrades classification but improves segmentation when only the centroids are used; and (3) to leverage the additional Gaussian parameters, we propose Gaussian feature grouping in a normalized feature space, together with a splats pooling layer, offering a tailored solution for effectively grouping and embedding similar Gaussians, which leads to notable improvements in the finetuning tasks. Our dataset and model are publicly available at https://unique1i.github.io/ShapeSplat.
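
To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of the named ingredients: normalizing the Gaussian parameters so heterogeneous channels are comparable, grouping similar splats in that normalized feature space, pooling each group into a token with a splats pooling layer, and randomly masking tokens MAE-style for pretraining. All module names, dimensions, the random-anchor stand-in for farthest point sampling, and the 59-channel parameter layout are illustrative assumptions, not the authors' implementation.

# Minimal sketch, assuming per-object Gaussian parameters as a (B, N, C) tensor.
import torch
import torch.nn as nn

def normalize_params(params: torch.Tensor) -> torch.Tensor:
    """Scale each parameter channel to zero mean / unit std per object, so that
    centroids, scales, rotations, opacity, and SH colors live on comparable ranges."""
    mean = params.mean(dim=1, keepdim=True)
    std = params.std(dim=1, keepdim=True).clamp_min(1e-6)
    return (params - mean) / std

def group_splats(feats: torch.Tensor, num_groups: int, group_size: int) -> torch.Tensor:
    """Pick `num_groups` anchor splats (random choice here, as a stand-in for
    farthest point sampling) and gather each anchor's k nearest neighbors
    in the normalized feature space."""
    B, N, C = feats.shape
    anchor_idx = torch.stack([torch.randperm(N)[:num_groups] for _ in range(B)])
    anchors = torch.gather(feats, 1, anchor_idx.unsqueeze(-1).expand(-1, -1, C))
    dist = torch.cdist(anchors, feats)                       # (B, G, N)
    knn_idx = dist.topk(group_size, largest=False).indices   # (B, G, k)
    groups = torch.gather(
        feats.unsqueeze(1).expand(-1, num_groups, -1, -1), 2,
        knn_idx.unsqueeze(-1).expand(-1, -1, -1, C))
    return groups                                            # (B, G, k, C)

class SplatsPool(nn.Module):
    """Embed each splat in a group, then max-pool over the group to obtain one
    token per group (one plausible reading of the 'splats pooling layer')."""
    def __init__(self, in_dim: int, embed_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, embed_dim), nn.GELU(),
                                 nn.Linear(embed_dim, embed_dim))

    def forward(self, groups: torch.Tensor) -> torch.Tensor:
        return self.mlp(groups).max(dim=2).values            # (B, G, D)

def random_mask(tokens: torch.Tensor, ratio: float = 0.6):
    """MAE-style random masking: keep only a subset of tokens for the encoder."""
    B, G, D = tokens.shape
    keep = int(G * (1 - ratio))
    idx = torch.rand(B, G).argsort(dim=1)[:, :keep]
    visible = torch.gather(tokens, 1, idx.unsqueeze(-1).expand(-1, -1, D))
    return visible, idx

# Toy usage: one object with 1024 Gaussians and 59 parameters each
# (3 centroid + 3 scale + 4 rotation + 1 opacity + 48 SH coefficients).
gs = torch.randn(1, 1024, 59)
groups = group_splats(normalize_params(gs), num_groups=64, group_size=32)
tokens = SplatsPool(59, 384)(groups)
visible, _ = random_mask(tokens)   # feed `visible` to a transformer encoder
print(visible.shape)               # torch.Size([1, 25, 384])

Normalizing before grouping matters because raw Gaussian parameters mix units and ranges (positions, quaternions, opacities, SH coefficients); without it, nearest-neighbor grouping would be dominated by whichever channels happen to have the largest magnitudes.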
Supplementary Material: zip
Submission Number: 126
