Characterizing the Discrete Geometry of ReLU Networks

ICLR 2026 Conference Submission 3505 Authors

09 Sept 2025 (modified: 25 Nov 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Polyhedra, Geometry, ReLU, Activations
TL;DR: We describe the geometry of the polyhedral complexes defined by the linear regions of ReLU networks, both by theoretically bounding their connectivity and diameter and by empirically characterizing it using experiments on trained networks.
Abstract: It is well established that ReLU networks define continuous piecewise-linear functions, and that their linear regions are polyhedra in the input space. These regions form a complex that fully partitions the input space. The way these regions fit together is fundamental to the behavior of the network, as nonlinearities occur only at the boundaries where regions meet. However, relatively little is known about the geometry of these complexes beyond bounds on the total number of regions, and computing the complex exactly is intractable for most networks. In this work, we prove new theoretical results about these complexes that hold for all fully-connected ReLU networks, specifically about their connectivity graphs, in which nodes correspond to regions and edges connect each pair of regions that share a face. We find that the average degree of this graph is upper bounded by twice the input dimension, regardless of the width and depth of the network, and that the diameter of this graph has an upper bound that does not depend on input dimension, despite the number of regions growing exponentially with input dimension. We corroborate our findings through experiments with networks trained on both synthetic and real-world data, which provide additional insight into the geometry of ReLU networks.
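The connectivity graph described in the abstract can be approximated numerically: each point's ReLU activation pattern identifies the linear region containing it, and sampling a dense grid reveals which regions abut one another. The sketch below is illustrative only and is not from the paper; the one-hidden-layer network, grid resolution, and random seed are all assumptions, and grid sampling only approximates face-adjacency (it can miss thin regions or, near a vertex, link regions that meet only at a point).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully-connected ReLU net on R^2: one hidden layer of m = 6 units.
# (The paper's results cover arbitrary width and depth; this is a minimal case.)
m, d = 6, 2
W, b = rng.normal(size=(m, d)), rng.normal(size=m)

# Sample a dense grid over a window of input space.
n = 400
xs = np.linspace(-3.0, 3.0, n)
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X.ravel(), Y.ravel()], axis=1)            # (n*n, d)

# The sign pattern of the pre-activations identifies the linear region
# containing each point; encode each pattern as a single integer code.
codes = ((pts @ W.T + b) > 0) @ (1 << np.arange(m))
C = codes.reshape(n, n)

# Approximate the connectivity graph: axis-aligned grid neighbors with
# different codes lie in regions separated by a shared boundary.
edges = set()
for A, B in ((C[:, :-1], C[:, 1:]), (C[:-1, :], C[1:, :])):
    diff = A != B
    edges.update(frozenset(e) for e in zip(A[diff].tolist(), B[diff].tolist()))

regions = set(codes.tolist())
avg_deg = 2 * len(edges) / len(regions)
print(f"{len(regions)} regions observed, average degree {avg_deg:.2f} "
      f"(paper's bound: 2 * input_dim = {2 * d})")
```

For one hidden layer the regions are the cells of a hyperplane arrangement, so six units in the plane yield at most 1 + 6 + 15 = 22 regions; the observed average degree should sit below the abstract's bound of twice the input dimension, up to the boundary effects of the finite window.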
Supplementary Material: zip
Primary Area: learning theory
Submission Number: 3505