A Simple Geometric Proof for the Benefit of Depth in ReLU Networks

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission
TL;DR: ReLU MLP depth separation proof with geometric arguments
Abstract: We present a simple proof for the benefit of depth in multi-layer feedforward networks with rectified activations ("depth separation"). Specifically, we present a sequence of classification problems f_i such that (a) for any fixed-depth rectified network, we can find an index m such that problems with index greater than m require exponential network width to be fully represented; and (b) for any problem f_m in the family, we present a concrete neural network with linear depth and bounded width that fully represents it. While there are several previous works showing similar results, our proof uses substantially simpler tools and techniques, and should be accessible to undergraduate students in computer science and people with similar backgrounds.
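The abstract does not spell out the family f_i, so the following is only an illustrative sketch of the kind of construction alluded to in (b): the well-known tent-map composition often used in depth-separation arguments, not the paper's actual proof. It shows how a depth-k, width-2 ReLU network represents a piecewise-linear function with 2^k pieces, which a shallow network cannot match without exponentially many units.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent_layer(x):
    # One hidden ReLU layer computing the tent map t(x) = 2x on [0, 1/2]
    # and 2(1 - x) on [1/2, 1]: t(x) = 2*relu(x) - 4*relu(x - 1/2).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def deep_sawtooth(x, depth):
    # Composing the tent map `depth` times yields a sawtooth with 2**depth
    # linear pieces, computed by a ReLU network of depth `depth` and width 2.
    for _ in range(depth):
        x = tent_layer(x)
    return x

# A depth-3 network already produces 8 oscillations on [0, 1].
xs = np.linspace(0.0, 1.0, 9)
print(deep_sawtooth(xs, depth=3))
```

This is the standard intuition behind depth separation: depth multiplies the number of linear regions, while width only adds to it. The geometric argument and the specific classification problems in the submission may differ.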