Integral Representations of Sobolev Spaces via ReLU Activation Function and Optimal Error Estimates for Linearized Networks
Abstract: This paper presents two main theoretical results concerning shallow neural networks with ReLU$^k$ activation functions. We establish a novel integral representation for Sobolev spaces, showing that every function in $H^{\frac{d+2k+1}{2}}(\Omega)$ can be expressed as an $L^2$-weighted integral of ReLU$^k$ ridge functions over the unit sphere. This result mirrors the known integral representation of Barron spaces and highlights a fundamental connection between Sobolev regularity and neural network representations. Moreover, we prove that linearized shallow networks, constructed by fixing the inner parameters and optimizing only the outer linear coefficients, achieve the optimal approximation rate $O(n^{-\frac{1}{2}-\frac{2k+1}{2d}})$ in Sobolev spaces.
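For concreteness, the claimed representation can be written in the following schematic form (the notation here is illustrative; the paper's exact parameter domain, measure, and normalization may differ):
$$
f(x) \;=\; \int_{\mathbb{S}^{d}} g(\omega)\,\bigl(\omega \cdot (x,1)\bigr)_+^{k}\, d\omega, \qquad g \in L^{2}(\mathbb{S}^{d}),
$$
where $(t)_+^k = \max(0,t)^k$ denotes the ReLU$^k$ activation, $(x,1) \in \mathbb{R}^{d+1}$ absorbs the bias into the inner parameter, and $g$ is the $L^2$ weight on the unit sphere.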
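The following minimal sketch illustrates what a linearized shallow network looks like in practice: the inner parameters are fixed and only the outer coefficients are fit by least squares. This is not the paper's construction; the uniform sampling of the fixed inner parameters, the target function, and all names (`fit_linearized_relu_k`, `n`, `k`) are assumptions made for illustration.

```python
import numpy as np

def fit_linearized_relu_k(X, y, n, k, rng):
    """Fit a linearized ReLU^k shallow network with n neurons.

    Inner parameters (w_i, b_i) are fixed (here: drawn uniformly from the
    unit sphere in R^{d+1}, an illustrative choice); only the outer linear
    coefficients are optimized, via least squares.
    """
    d = X.shape[1]
    # Fixed inner parameters: rows of wb lie on the unit sphere S^d.
    wb = rng.standard_normal((n, d + 1))
    wb /= np.linalg.norm(wb, axis=1, keepdims=True)
    w, b = wb[:, :d], wb[:, d]
    # Feature matrix Phi[j, i] = ReLU(w_i . x_j + b_i)^k.
    phi = np.maximum(X @ w.T + b, 0.0) ** k
    # Optimize only the linear (outer) coefficients.
    coef, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w, b, coef

def predict(X, w, b, coef, k):
    return (np.maximum(X @ w.T + b, 0.0) ** k) @ coef

# Usage: approximate a smooth target on [0, 1]^2 with n = 200 neurons, k = 2.
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])
w, b, coef = fit_linearized_relu_k(X, y, n=200, k=2, rng=rng)
rmse = np.sqrt(np.mean((predict(X, w, b, coef, k=2) - y) ** 2))
```

Because only the outer coefficients are free, training reduces to a linear least-squares problem, which is what makes the optimal rates for this restricted model class notable.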