ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology

Published: 18 Jun 2023, Last Modified: 21 Jul 2023, TAGML 2023 Poster
Keywords: ReLU feedforward neural networks, polyhedra, polyhedral decompositions, Hamming distance, linear programming, persistent homology, barcode, topological invariants
Abstract: A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a wide range of networks, trained for purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.
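As a rough illustration of the idea (a minimal sketch, not the authors' code), the example below uses a randomly initialized ReLU network in NumPy. Each input point is mapped to the 0/1 pattern of active ReLU units, which is constant on each polyhedron of the decomposition, and the pairwise Hamming distance between these patterns gives a coarse, network-induced metric on samples drawn from a circle. The network sizes, the circle example, and the use of raw Hamming distance as a stand-in for the dual-graph distance are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomly initialized ReLU MLP with two hidden layers (illustrative sizes).
layer_dims = [2, 16, 16]
weights = [rng.standard_normal((m, n)) for n, m in zip(layer_dims[:-1], layer_dims[1:])]
biases = [rng.standard_normal(m) for m in layer_dims[1:]]

def activation_pattern(x):
    """Return the 0/1 pattern of active ReLU units; constant on each polyhedron."""
    pattern = []
    h = x
    for W, b in zip(weights, biases):
        z = W @ h + b
        pattern.append((z > 0).astype(np.uint8))
        h = np.maximum(z, 0.0)
    return np.concatenate(pattern)

# Sample points from a circle, a manifold whose H_1 signal one would hope to detect.
theta = rng.uniform(0.0, 2.0 * np.pi, size=200)
points = np.stack([np.cos(theta), np.sin(theta)], axis=1)

codes = np.stack([activation_pattern(p) for p in points])

# Pairwise Hamming distance between activation patterns: a coarse,
# network-induced quantization of distances between the sample points.
hamming = (codes[:, None, :] != codes[None, :, :]).sum(axis=2)

# A persistent-homology package that accepts distance matrices could now be
# run on `hamming` to look for a long H_1 bar corresponding to the circle.
print(hamming.shape, hamming.max())
```

In this sketch the Hamming-distance matrix plays the role of the filtration input for persistent homology; the paper's construction instead works with the dual graph of the polyhedral decomposition, which this example only approximates.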
Type Of Submission: Proceedings Track (8 pages)
Submission Number: 10