Abstract: Volume parameterizations abound in recent literature, ranging from classic voxel grids to implicit neural representations.
While implicit representations offer impressive capacity and improved memory efficiency compared to voxel grids, they traditionally require training through nonconvex optimization, which can be slow and sensitive to initialization and hyperparameters.
We introduce GA-Planes, a novel family of implicit neural volume representations inspired by Geometric Algebra that can be trained by convex optimization, sidestepping these drawbacks of nonconvex training.
GA-Planes models generalize many existing representations, encompassing any combination of features stored in tensor basis elements followed by a neural feature decoder, and can be adapted to convex or nonconvex training as required by different inverse problems.
In the 2D setting, we prove that GA-Planes models are equivalent to a low-rank plus low-resolution matrix factorization, which outperforms the classic low-rank plus sparse decomposition for fitting a natural image.
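As a concrete illustration of this 2D claim, the sketch below fits an image as the sum of a low-rank term and a bilinearly upsampled low-resolution grid. The variable names, resolutions, and the use of plain gradient descent are illustrative assumptions, not the paper's exact convex formulation.

```python
# Minimal sketch (assumed formulation): fit a 2D image as a low-rank term U @ V.T
# plus a bilinearly upsampled low-resolution grid L.
import torch
import torch.nn.functional as F

H, W, rank, low_res = 256, 256, 8, 32
image = torch.rand(H, W)  # stand-in for a natural image

U = torch.nn.Parameter(0.01 * torch.randn(H, rank))          # low-rank factor
V = torch.nn.Parameter(0.01 * torch.randn(W, rank))          # low-rank factor
L = torch.nn.Parameter(torch.zeros(1, 1, low_res, low_res))  # low-resolution grid

opt = torch.optim.Adam([U, V, L], lr=1e-2)
for step in range(2000):
    upsampled = F.interpolate(L, size=(H, W), mode="bilinear", align_corners=False)[0, 0]
    pred = U @ V.T + upsampled  # low-rank + low-resolution approximation
    loss = F.mse_loss(pred, image)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

This sketch trains the factors by plain (nonconvex) gradient descent for brevity; per the abstract, GA-Planes models themselves can be formulated for convex training.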
In 3D, GA-Planes models exhibit competitive expressiveness, model size, and optimizability across tasks such as radiance field reconstruction, 3D segmentation, and video segmentation.
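To make the family described above (features stored in tensor basis elements followed by a neural feature decoder) more concrete, here is a minimal sketch of a GA-Planes-style 3D model, assuming line, plane, and low-resolution volume feature grids combined by concatenation; the class name, resolutions, and combination rule are illustrative choices, not the authors' exact architecture.

```python
# Sketch of a GA-Planes-style volume model (assumed architecture): features live on
# axis-aligned line grids, plane grids, and one low-resolution volume grid; they are
# sampled at query points, concatenated, and decoded by a small MLP.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GAPlanesSketch(nn.Module):
    def __init__(self, feat_dim=8, line_res=128, plane_res=64, vol_res=32):
        super().__init__()
        # One 1D feature grid per axis (vector-like basis elements).
        self.lines = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(1, feat_dim, line_res, 1)) for _ in range(3)]
        )
        # One 2D feature grid per coordinate pair (bivector-like basis elements).
        self.planes = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(1, feat_dim, plane_res, plane_res)) for _ in range(3)]
        )
        # One low-resolution 3D feature grid (trivector-like basis element).
        self.volume = nn.Parameter(0.1 * torch.randn(1, feat_dim, vol_res, vol_res, vol_res))
        # Small neural feature decoder, e.g. to RGB + density.
        self.decoder = nn.Sequential(nn.Linear(7 * feat_dim, 64), nn.ReLU(), nn.Linear(64, 4))

    def forward(self, xyz):
        # xyz: (N, 3) query coordinates in [-1, 1]^3.
        n = xyz.shape[0]
        feats = []
        # Sample each line grid along its axis (the size-1 width dimension is queried at 0).
        for axis, grid in enumerate(self.lines):
            coords = torch.stack([torch.zeros_like(xyz[:, axis]), xyz[:, axis]], dim=-1)
            sampled = F.grid_sample(grid, coords.view(1, n, 1, 2), align_corners=True)
            feats.append(sampled.reshape(-1, n).T)
        # Sample each plane grid with its coordinate pair (grid_sample expects (x, y) order).
        for (i, j), grid in zip([(0, 1), (0, 2), (1, 2)], self.planes):
            sampled = F.grid_sample(grid, xyz[:, [j, i]].view(1, n, 1, 2), align_corners=True)
            feats.append(sampled.reshape(-1, n).T)
        # Sample the low-resolution volume grid (coordinates reversed to match (W, H, D)).
        sampled = F.grid_sample(self.volume, xyz[:, [2, 1, 0]].view(1, n, 1, 1, 3), align_corners=True)
        feats.append(sampled.reshape(-1, n).T)
        return self.decoder(torch.cat(feats, dim=-1))  # (N, 4)

# Example query: 1024 random points in [-1, 1]^3 -> (1024, 4) decoded features.
model = GAPlanesSketch()
out = model(2 * torch.rand(1024, 3) - 1)
```

Dropping particular grids or changing how their features are combined recovers familiar special cases such as tri-plane or dense-voxel models, which is the sense in which the abstract describes GA-Planes as generalizing existing representations.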
Lay Summary: This paper introduces GA-Planes (Geometric Algebra Planes), a new method for representing 3D objects and scenes. In computer graphics and machine learning, various formats—like grids, neural networks, or combinations of both—are commonly used to store and process 3D data. Implicit representations are a popular choice because they offer high capacity and memory efficiency compared to traditional volumetric grids. However, they are often difficult and slow to train due to complex, nonconvex optimization.
GA-Planes addresses this challenge by using ideas from Geometric Algebra to create a more efficient and stable representation that supports convex optimization. It generalizes many existing methods that combine learnable position-dependent feature grids with neural decoders. By enabling convex training, GA-Planes reduces reliance on careful initialization and tuning.
The authors demonstrate that GA-Planes is versatile and can be applied to a wide range of tasks, including 2D image fitting, 3D scene reconstruction, and video representation.
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Link To Code: https://github.com/sivginirmak/Geometric-Algebra-Planes
Primary Area: Optimization->Convex
Keywords: Volume representation, tensor decomposition, convex optimization, geometric algebra, NeRF
Submission Number: 8972