Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions

Published: 01 Jan 2024 · Last Modified: 21 May 2024 · AAAI 2024 · CC BY-SA 4.0
Abstract: The training problem of neural networks (NNs) is known to be ∃ℝ-complete for ReLU and linear activation functions. We show that the training problem for NNs equipped with arbitrary activation functions is polynomial-time bireducible to the existential theory of the reals extended with the corresponding activation functions. For effectively continuous activation functions (e.g., the sigmoid function), we obtain membership in low levels of the arithmetical hierarchy. Consequently, for the sigmoid activation function the training problem corresponds to the existential theory of the reals extended with the exponential function, and hence the decidability of training NNs with sigmoid activations is equivalent to the decidability of the existential theory of the reals with the exponential function, a long-standing open problem (Tarski's exponential function problem). In contrast, we show that the training problem is undecidable if sinusoidal activation functions are considered.
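As a rough illustration of the connection described above (a sketch in our own notation, not the paper's formalization; the network shape, squared loss, and symbol names below are assumptions), the training decision problem for a one-hidden-layer network of width k on data points (x_i, y_i) with error threshold ε can be written as a single existential sentence over the reals extended with the activation function σ:

% Sketch: the NN training decision problem as an existential sentence
% over (R, +, *, <, sigma). Notation (w, v, b, c, eps, squared loss) is
% illustrative, not taken from the paper.
\[
\exists w \in \mathbb{R}^{k \times d}\;
\exists v \in \mathbb{R}^{k}\;
\exists b \in \mathbb{R}^{k}\;
\exists c \in \mathbb{R}:\quad
\sum_{i=1}^{n} \Bigl( \sum_{j=1}^{k} v_j \, \sigma\bigl(w_j^{\top} x_i + b_j\bigr) + c - y_i \Bigr)^{2} \le \varepsilon .
\]
% With sigma = ReLU this is a sentence of the existential theory of the
% reals (ETR); with sigma = sigmoid it becomes a sentence of ETR extended
% with exp, since sigma(t) = 1 / (1 + exp(-t)).

Under this reading, deciding trainability for a given activation function amounts to deciding such sentences in the correspondingly extended existential theory, which is the bireducibility the abstract states.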