Abstract: Let φ be a bounded function on (−∞, +∞) with lim x→+∞ φ(x) = M and lim x→−∞ φ(x) = m; such a function is called a general sigmoidal function. Using a general sigmoidal function as the activation function, we first construct a type of single-hidden-layer feedforward neural network (FNN) with n + 1 hidden neurons that can learn n + 1 distinct samples with zero error. We then present a class of single-hidden-layer FNNs, the approximate interpolation neural networks, which can approximately interpolate, with arbitrary precision, any set of distinct data in one dimension. Finally, we estimate the errors between the exact and approximate interpolation neural networks by means of algebraic methods.
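As a rough illustration of the zero-error interpolation idea summarized above, the following minimal Python sketch (not the paper's actual construction) uses the logistic function as one example of a general sigmoidal φ, assigns one hidden neuron per sample point, and solves a linear system for the output weights. The steepness value, the threshold placement, and the function names are assumptions made only for this example.

```python
# Minimal sketch (assumptions, not the paper's construction): exact interpolation
# of n + 1 distinct one-dimensional samples by a single-hidden-layer network
# with n + 1 sigmoidal hidden neurons.
import numpy as np

def logistic(x):
    # One example of a "general sigmoidal" function: bounded, with
    # limits m = 0 at -infinity and M = 1 at +infinity.
    return 1.0 / (1.0 + np.exp(-x))

def build_exact_interpolant(xs, ys, steepness=50.0):
    """Return f(x) = sum_j c_j * phi(steepness * (x - t_j)) with f(x_i) = y_i."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    # Place one hidden-neuron threshold slightly to the left of each sample
    # point, so the hidden-output matrix is nearly lower triangular and,
    # for a steep enough sigmoid, invertible (an assumption of this sketch).
    gaps = np.diff(xs, prepend=xs[0] - 1.0)
    ts = xs - 0.5 * gaps
    H = logistic(steepness * (xs[:, None] - ts[None, :]))   # (n+1) x (n+1)
    c = np.linalg.solve(H, ys)                               # output weights
    return lambda x: logistic(steepness * (np.atleast_1d(x)[:, None] - ts)) @ c

# Usage: the network reproduces the samples up to floating-point error.
xs = [0.0, 0.7, 1.5, 2.2, 3.0]
ys = [1.0, -0.5, 2.0, 0.3, 1.2]
f = build_exact_interpolant(xs, ys)
print(np.max(np.abs(f(xs) - ys)))   # ~0: zero-error interpolation of n + 1 samples
```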