Neural Networks with Adaptive Activation Functions and their Application to the Solution of PDEs

17 Sept 2025 (modified: 22 Dec 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Sampling-Based Neural Networks, Approximation, Adaptive Parameters, Solution of PDEs, Rational Function
Abstract: We study fully connected neural networks for function approximation, focusing on the role of adaptive activation functions. Recent work has shown that introducing trainable parameters into activation functions, particularly rational functions, can substantially improve network expressiveness. We extend this idea to sampling-based neural networks, which replace backpropagation with forward sampling to achieve faster training. Existing sampling-based methods, however, employ only fixed activation functions, which limits their performance. To address this gap, we develop a computational framework that integrates adaptive activation functions into sampling-based training, enabling the activation parameters to be learned directly through sampling, without gradient-based optimization. Experiments demonstrate that our approach preserves the efficiency of sampling-based methods while significantly improving approximation accuracy, highlighting the benefits of parameterized activation functions in non-gradient training regimes.
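The abstract combines two ideas: rational activation functions with tunable coefficients, and sampling-based training in which hidden-layer parameters are drawn randomly and only the output layer is solved directly. The sketch below illustrates that combination under stated assumptions: the rational parameterization (cubic numerator, positive-definite denominator), the random sampling of the activation coefficients `p` and `q` alongside the hidden weights, and the least-squares output solve are all illustrative choices, not the paper's actual algorithm.

```python
import numpy as np

def rational(x, p, q):
    """Evaluate a rational activation r(x) = P(x) / (1 + |Q(x)|).

    p: numerator coefficients (highest degree first, as in np.polyval).
    q: denominator coefficients without the constant term; the
       1 + |.| form keeps the denominator positive (no poles).
    This parameterization is an assumption for illustration.
    """
    return np.polyval(p, x) / (1.0 + np.abs(np.polyval(q, x)))

rng = np.random.default_rng(0)

# Toy 1-D regression target.
X = np.linspace(-1.0, 1.0, 200)[:, None]
y = np.sin(np.pi * X).ravel()

# Sampling-based setup: draw hidden weights/biases AND the activation
# parameters at random instead of training them by backpropagation.
n_hidden = 50
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
p = rng.normal(size=4)   # cubic numerator coefficients (sampled)
q = rng.normal(size=2)   # linear denominator coefficients (sampled)

# Hidden features, then a direct least-squares solve for the output layer.
H = rational(X @ W + b, p, q)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
rel_err = np.linalg.norm(H @ beta - y) / np.linalg.norm(y)
```

In a gradient-free scheme of this kind, improving the activation parameters amounts to resampling (or otherwise searching over) `p` and `q` and keeping the draw with the smallest residual, since the output layer is recomputed cheaply by least squares for each candidate.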
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 9910