Size and depth of monotone neural networks: interpolation and approximation

Published: 31 Oct 2022, Last Modified: 28 Jan 2023
NeurIPS 2022 Accept
Keywords: Monotone neural networks, interpolation, expressivity, benefit of depth
TL;DR: We provide new constructions of constant-depth neural networks with positive parameters and threshold units for interpolation and approximation
Abstract: Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with $n$ points, and the goal is to find a size- and depth-efficient monotone neural network with \emph{non-negative parameters} and threshold units that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth $2$. On the other hand, we prove that for every monotone data set with $n$ points in $\mathbb{R}^d$, there exists an interpolating monotone network of depth $4$ and size $O(nd)$. Our interpolation result implies that every monotone function over $[0,1]^d$ can be approximated arbitrarily well by a depth-$4$ monotone network, improving on the previous best-known construction of depth $d+1$. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
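To make the setting concrete, here is a minimal illustrative sketch (our own, not from the paper) of the two objects the abstract talks about: a feedforward threshold network whose weights are non-negative, which makes the computed function automatically monotone, and a check that a labeled data set is monotone (i.e., admits a monotone interpolant). All function names below are hypothetical, and this is a toy convention, not the paper's construction.

```python
import numpy as np

def threshold(z):
    # Heaviside threshold unit: fires 1 when its affine input is >= 0.
    return (z >= 0).astype(float)

def monotone_net(x, hidden_layers, out_w, out_b):
    """Forward pass of a threshold network with entrywise non-negative
    weight matrices. Non-negative weights composed with monotone
    activations make the output non-decreasing in every input
    coordinate; the biases may be arbitrary."""
    h = np.asarray(x, dtype=float)
    for W, b in hidden_layers:
        assert (W >= 0).all(), "monotonicity needs non-negative weights"
        h = threshold(W @ h + b)
    assert (out_w >= 0).all()
    return out_w @ h + out_b  # non-negative linear read-out

def is_monotone_dataset(X, y):
    """A data set is monotone when x_i <= x_j coordinatewise
    implies y_i <= y_j."""
    return all(
        y[i] <= y[j]
        for i in range(len(y))
        for j in range(len(y))
        if (X[i] <= X[j]).all()
    )

# Tiny example: one hidden threshold layer plus a positive read-out,
# computing a monotone step of x1 + x2.
W1 = np.array([[1.0, 1.0]])   # non-negative weights
b1 = np.array([-1.0])         # bias may be negative
print(monotone_net([0.2, 0.3], [(W1, b1)], np.array([1.0]), 0.0))  # 0.0
print(monotone_net([0.7, 0.8], [(W1, b1)], np.array([1.0]), 0.0))  # 1.0

X = np.array([[0.0, 0.0], [1.0, 1.0]])
print(is_monotone_dataset(X, np.array([0.0, 1.0])))  # True
print(is_monotone_dataset(X, np.array([1.0, 0.0])))  # False: violates monotonicity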
Supplementary Material: pdf