Session: Sampling and learning of deep neural networks (Philipp Petersen)
Keywords: Deep ReLU Neural Networks, Learning on Manifolds, Function Representation, Network Complexity
TL;DR: We study the capability of deep ReLU neural networks to represent functions on manifolds by constructing local networks on coordinate neighborhoods and resolving mismatches across the global manifold.
Abstract: We explore the ability of deep ReLU neural networks to realize functions on manifolds. Under suitable assumptions, the coordinate charts can be represented exactly, without error. Locally, we construct networks that realize tooth functions on coordinate neighborhoods. To resolve mismatches arising from the manifold's complex structure, we further develop a global tooth function defined over the entire manifold, which is itself effectively represented by a ReLU neural network.
Submission Number: 11
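The abstract above builds local ReLU networks that realize "tooth" functions on coordinate neighborhoods. As a minimal illustrative sketch (not the authors' construction), and assuming "tooth function" refers to a piecewise-linear hat-type profile, the snippet below shows how such a one-dimensional tooth on an interval is realized exactly by a one-hidden-layer ReLU network with three units; this is the standard hat-function identity t(x) = 2·relu(x) − 4·relu(x − 1/2) + 2·relu(x − 1).

```python
import numpy as np


def relu(x):
    # ReLU activation: max(x, 0), applied elementwise.
    return np.maximum(x, 0.0)


def tooth(x):
    """Hat ("tooth") function supported on [0, 1]: rises linearly from 0 to 1
    on [0, 1/2], falls back to 0 on [1/2, 1], and vanishes elsewhere.
    Written exactly as a one-hidden-layer ReLU network with three hidden units:
        t(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1)
    (Illustrative only; the submission's construction on manifolds is not shown here.)
    """
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)


# Quick check that the ReLU representation matches the piecewise-linear tooth
# max(0, 1 - 2|x - 1/2|) on a grid, including points outside the support.
xs = np.linspace(-0.5, 1.5, 201)
reference = np.clip(1.0 - 2.0 * np.abs(xs - 0.5), 0.0, None)
assert np.allclose(tooth(xs), reference)
```

Localizing such building blocks to coordinate neighborhoods and combining them consistently across charts is, per the abstract, where the global tooth function on the manifold comes in.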