Scalable GPU-Accelerated Euler Characteristic Curves: Optimization and Differentiable Learning for PyTorch

Published: 23 Sept 2025 · Last Modified: 27 Nov 2025 · NeurReps 2025 Poster · CC BY 4.0
Keywords: Topological Deep Learning, Topological Data Analysis, Geometric Representations, Euler Characteristic Curve, Euler Characteristic Transform, Differentiable Programming, GPU Computing, PyTorch
TL;DR: We introduce efficient GPU computation of the Euler Characteristic Curve, as well as a learnable single-direction PyTorch layer.
Abstract: Topological features capture global geometric structure in imaging data, but practical adoption in deep learning requires both computational efficiency and differentiability. We present optimized GPU kernels for Euler Characteristic Curve (ECC) computation, achieving 16-2000× speedups over prior GPU implementations on synthetic grids, and introduce a differentiable PyTorch layer enabling end-to-end learning. Our CUDA kernels, optimized for Ampere GPUs, use 128B-coalesced access and hierarchical shared-memory accumulation. Our PyTorch layer learns thresholds in a single direction via a Differentiable Euler Characteristic Transform-style sigmoid relaxation. We discuss downstream relevance, including applications highlighted by prior ECC work, and outline batching/multi-GPU extensions to broaden adoption.
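
To make the sigmoid relaxation concrete, below is a minimal sketch of a differentiable soft-ECC layer for 2D images under a sublevel-set (intensity) filtration, not the authors' implementation: the class name SoftECC2D, the temperature tau, and the uniform threshold initialization are illustrative assumptions rather than details from the paper.

```python
import torch
import torch.nn as nn

class SoftECC2D(nn.Module):
    """Sketch of a differentiable Euler Characteristic Curve layer for 2D
    images under a sublevel-set filtration. The hard indicator
    1[f(cell) <= t] is relaxed with a sigmoid of temperature `tau`, in the
    spirit of the DECT-style relaxation; thresholds are learnable."""

    def __init__(self, num_thresholds: int = 32, tau: float = 0.1):
        super().__init__()
        # Learnable filtration thresholds, initialised uniformly in [0, 1].
        self.thresholds = nn.Parameter(torch.linspace(0.0, 1.0, num_thresholds))
        self.tau = tau

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # img: (B, H, W) vertex values in [0, 1]; returns (B, num_thresholds).
        v = img                                              # 0-cells (vertices)
        eh = torch.maximum(img[:, :, :-1], img[:, :, 1:])    # horizontal 1-cells
        ev = torch.maximum(img[:, :-1, :], img[:, 1:, :])    # vertical 1-cells
        sq = torch.maximum(eh[:, :-1, :], eh[:, 1:, :])      # 2-cells (unit squares)

        t = self.thresholds.view(1, -1, 1)                   # (1, K, 1)

        def soft_count(cells: torch.Tensor) -> torch.Tensor:
            # Soft count of cells whose filtration value is <= each threshold.
            f = cells.flatten(1).unsqueeze(1)                # (B, 1, N)
            return torch.sigmoid((t - f) / self.tau).sum(dim=-1)  # (B, K)

        # Relaxed Euler characteristic #V - #E + #F at every threshold.
        return soft_count(v) - soft_count(eh) - soft_count(ev) + soft_count(sq)
```

In this sketch the indicator is replaced by sigmoid((t - f(cell)) / tau), so the curve is differentiable with respect to both the input image and the thresholds; as tau shrinks the output approaches the exact ECC of the cubical complex built on the image grid.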
Submission Number: 57