Adaptive Resolution Residual Networks

Published: 31 Oct 2023, Last Modified: 10 Nov 2023 · DLDE III Spotlight Talk
Keywords: Neural operator, Adaptive resolution, Rediscretization, Convolutional Network, Residual Network, Laplacian pyramid, Laplacian residual, Laplacian dropout
TL;DR: We introduce an architecture for signal-based machine learning tasks that can be adapted to different resolutions with high computational efficiency and robustness, without imposing difficult design constraints.
Abstract: We introduce Adaptive Resolution Residual Networks (ARRNs), a form of neural operator for signal-based tasks whose networks can be rediscretized to suit any signal resolution. ARRNs are composed of a chain of Laplacian residuals, each containing ordinary layers, which do not themselves need to be rediscretizable for the whole network to be rediscretizable. ARRNs require fewer Laplacian residuals for exact evaluation on lower-resolution signals, which greatly reduces computational cost. ARRNs also implement Laplacian dropout, which encourages networks to become robust to low-bandwidth signals. ARRNs can thus be trained once at high resolution and then rediscretized on the fly at a suitable resolution with great robustness.
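The abstract's key mechanism, splitting a signal into a low-frequency part plus a Laplacian (detail) residual, can be illustrated with a minimal sketch. This is not the authors' implementation: the average-pool downsampling, nearest-neighbour upsampling, and the function names `laplacian_split`/`laplacian_merge` are all illustrative assumptions. It shows why a band-limited signal yields a zero detail residual, which is what lets ARRNs skip higher-frequency residual blocks on lower-resolution inputs.

```python
import numpy as np

def downsample(x):
    # Average-pool by 2 (a simple stand-in for a band-limiting filter).
    return x.reshape(-1, 2).mean(axis=1)

def upsample(x):
    # Nearest-neighbour upsampling by 2 (illustrative choice).
    return np.repeat(x, 2)

def laplacian_split(x):
    # Split a signal into a low-frequency part and a Laplacian (detail) residual.
    low = downsample(x)
    detail = x - upsample(low)
    return low, detail

def laplacian_merge(low, detail):
    # Exact inverse of laplacian_split: perfect reconstruction.
    return upsample(low) + detail

x = np.array([1.0, 3.0, 2.0, 6.0])
low, detail = laplacian_split(x)
assert np.allclose(laplacian_merge(low, detail), x)

# A band-limited signal (constant over each pair of samples) has a zero
# detail residual, so the residual computation on that band contributes
# nothing and can be skipped at lower resolution.
smooth = upsample(np.array([2.0, 5.0]))
_, d = laplacian_split(smooth)
assert np.allclose(d, 0.0)
```

Chaining such splits yields a Laplacian pyramid; in an ARRN each level's detail band feeds a residual block of ordinary layers, under this reading of the abstract.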
Submission Number: 24