Neural networks can be FLOP-efficient integrators of 1D oscillatory integrands

Published: 08 Apr 2024, Last Modified: 17 Sept 2024. Accepted by TMLR (CC BY 4.0).
Abstract: We demonstrate that neural networks can be FLOP-efficient integrators of one-dimensional oscillatory integrands. We train a feed-forward neural network to compute integrals of highly oscillatory 1D functions. The training set is a parametric combination of functions with varying characters and degrees of oscillatory behavior. Numerical examples show that these networks are FLOP-efficient for sufficiently oscillatory integrands, with an average FLOP gain of $10^{3}$. The network calculates oscillatory integrals more accurately than traditional quadrature methods under the same computational budget, or number of floating-point operations. We find that feed-forward networks with five hidden layers are sufficient for a relative accuracy of $10^{-3}$. The computational burden of inference of the neural network is relatively small, even compared to inner-product pattern quadrature rules. We postulate that our result follows from learning latent patterns in the oscillatory integrands that are otherwise opaque to traditional numerical integrators.
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Camera-ready version (mostly minor formatting changes).
Code: https://github.com/comp-physics/deepOscillations
Assigned Action Editor: ~Manuel_Haussmann1
Submission Number: 2129