Polyhedral Complex Extraction from ReLU Networks using Edge Subdivision

Published: 18 Jun 2023, Last Modified: 27 Jun 2023 · TAGML 2023 Poster
Keywords: ReLU, folded hyperplane arrangement, polyhedral complex, sign-vector, subdivision
TL;DR: Subdividing edges is much faster than subdividing regions in a polyhedral complex of a ReLU network.
Abstract: A neural network (NN) consisting of piecewise affine building blocks, such as fully-connected layers and ReLU activations, is itself a piecewise affine function supported on a polyhedral complex. This complex has been studied to characterize theoretical properties of NNs and has been linked to geometry representations, but, in practice, extracting it remains a challenge. Previous works subdivide the regions via intersections with hyperplanes induced by each neuron. Instead, we propose to subdivide the edges, leading to a novel method for polyhedral complex extraction. This alleviates computational redundancy and affords efficient data structures. Key to this are sign-vectors, which encode the combinatorial structure of the complex. Our implementation (available on GitHub) uses standard tensor operations and can run exclusively on the GPU, taking seconds for millions of cells on a consumer-grade machine. Motivated by the growing interest in neural shape representation, we use the speed and differentiability of our method to optimize geometric properties of the complex. Our code is available at https://github.com/arturs-berzins/relu_edge_subdivision.
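To make the notion of sign-vectors concrete, the following is a minimal sketch (not taken from the paper's repository) of how the sign of every neuron's pre-activation at a query point could be collected with standard PyTorch tensor operations; the network architecture, sizes, and the `sign_vectors` helper are purely illustrative assumptions.

```python
import torch

# Hypothetical toy MLP with two ReLU layers; names and sizes are illustrative only.
torch.manual_seed(0)
layers = [torch.nn.Linear(3, 16), torch.nn.Linear(16, 16)]
relu = torch.nn.ReLU()

def sign_vectors(points: torch.Tensor) -> torch.Tensor:
    """Return, for each point, the signs of all neurons' pre-activations.

    The concatenated signs form a combinatorial label (a "sign-vector") of the
    cell of the polyhedral complex that contains the point. This only
    illustrates the idea of sign-vectors, not the edge-subdivision algorithm.
    """
    signs = []
    x = points
    for layer in layers:
        pre = layer(x)                  # pre-activations of this layer's neurons
        signs.append(torch.sign(pre))   # -1, 0, or +1 per neuron
        x = relu(pre)                   # continue the forward pass on activations
    return torch.cat(signs, dim=-1)     # shape: (num_points, total_num_neurons)

pts = torch.rand(5, 3)                  # 5 query points in a 3D input domain
print(sign_vectors(pts).shape)          # torch.Size([5, 32])
```

All operations are batched tensor ops, so the same sketch runs unchanged on the GPU by moving the layers and points to a CUDA device.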
Type Of Submission: Extended Abstract (4 pages, non-archival)
Submission Number: 70