Keywords: Laplacian-free diffusion, Sinkhorn normalization, Mass-preserving diffusion operator, heat kernel approximation, geometric data analysis, learning on unstructured data, similarity kernels, spectral methods, geometry processing, Laplace–Beltrami, manifold learning, Gaussian splatting, optimal transport
TL;DR: This work presents a method to compute heat diffusion-like operators from similarity matrices using optimal transport, applicable to generic unstructured data such as point clouds, voxel soups, or Gaussian splats.
Abstract: Smoothing a signal based on local neighborhoods is a core operation in machine learning and geometry processing. On well-structured domains such as vector spaces and manifolds, the Laplace operator derived from differential geometry offers a principled approach to smoothing via heat diffusion, with strong theoretical guarantees. However, constructing such Laplacians requires a carefully defined domain structure, which is not always available. Most practitioners thus rely on simple convolution kernels and message-passing layers, which suffer from bias at the boundaries of the domain.
We bridge this gap by introducing a broad class of *smoothing operators*, derived from general similarity or adjacency matrices, and demonstrate that they can be normalized into *diffusion-like operators* that inherit desirable properties from Laplacians. Our approach relies on a symmetric variant of the Sinkhorn algorithm, which rescales positive smoothing operators to match the structural behavior of heat diffusion.
This construction enables Laplacian-like smoothing and processing of irregular data such as point clouds, sparse voxel grids, or mixtures of Gaussians. We show that the resulting operators not only approximate heat diffusion but also retain spectral information from the Laplacian itself, with applications to shape analysis and matching.
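For concreteness, here is a minimal NumPy sketch of the kind of normalization the abstract describes: a Gaussian similarity kernel on a point cloud is rescaled by a symmetric Sinkhorn iteration into a doubly stochastic, diffusion-like operator. The kernel choice, the bandwidth `sigma`, and the damped fixed-point update are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(points, sigma):
    """Dense Gaussian similarity matrix on a point cloud (illustrative kernel choice)."""
    sq_dists = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def sinkhorn_symmetric(K, n_iters=100):
    """Symmetric Sinkhorn scaling: find d > 0 so that diag(d) @ K @ diag(d) is doubly stochastic."""
    d = np.ones(K.shape[0])
    for _ in range(n_iters):
        # Damped fixed-point update; at convergence d_i * (K d)_i = 1 for every row.
        d = np.sqrt(d / (K @ d))
    return d[:, None] * K * d[None, :]

# Usage: one diffusion-like smoothing step of a signal on a random point cloud.
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))
signal = rng.normal(size=500)

P = sinkhorn_symmetric(gaussian_kernel(points, sigma=0.3))
smoothed = P @ signal
```

The resulting operator `P` is symmetric and mass-preserving (rows and columns sum to one), which is what allows it to mimic heat diffusion; repeated application `P @ P @ ... @ signal` plays the role of diffusing for longer times.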
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 12901