Analytical selection of hidden parameters through expanded enhancement matrix stability for functional-link neural networks and broad learning systems

Published: 01 Jan 2025, Last Modified: 13 May 2025. Knowl. Based Syst. 2025. CC BY-SA 4.0
Abstract: The functional-link neural network (FLNN) and its recent advancement, the broad learning system (BLS), share the same mathematical essence: an analytical solution for the parameters of their respective output layers. However, their performance often depends on an excessive number of enhancement nodes and on randomly assigned hidden parameters. This raises a serious challenge: how to effectively select randomly assigned hidden parameters from the available candidates so as to simultaneously guarantee stronger generalization capability and fast training speed. In this study, we introduce the new concept of expanded enhancement matrix stability (EEMS) to address this challenge and identify the crucial factor for fast training in FLNN and BLS. We theoretically reveal the relationship between EEMS and the generalization capabilities of FLNN and BLS, in terms of the upper bounds of both the generalization error and the variance of the cross-validation loss. Building on this, we derive analytical (and hence fast) hidden parameter selection algorithms—EEMS-r and EEMS-cv—for both FLNN and BLS, based on EEMS with respect to the generalization error and the variance of the cross-validation loss, respectively. Experimental results on 14 benchmark datasets demonstrate the effectiveness of the proposed EEMS-r and EEMS-cv algorithms in enhancing generalization performance on most of the adopted datasets, and reveal that a downsized BLS structure may be preset empirically, in contrast to the standard BLS.
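The abstract's claim of a shared "analytical solution to the output-layer parameters" can be illustrated with a minimal sketch: random enhancement nodes produce an expanded feature matrix, and the output weights are obtained in closed form via ridge-regularized least squares. This is an assumed, simplified form of the FLNN/BLS training step (the specific feature construction, EEMS selection, and hyperparameters here are illustrative, not the paper's method).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only)
X = rng.normal(size=(100, 5))                 # inputs
y = np.sin(X).sum(axis=1, keepdims=True)      # target

# Randomly assigned hidden (enhancement) parameters, as in FLNN/BLS
n_enh = 50
W = rng.normal(size=(5, n_enh))               # random hidden weights
b = rng.normal(size=(1, n_enh))               # random hidden biases
H = np.tanh(X @ W + b)                        # enhancement matrix

# Expanded matrix: original features concatenated with enhancement nodes
A = np.hstack([X, H])

# Closed-form (analytical) output weights via ridge regression:
# beta = (A^T A + lam * I)^{-1} A^T y
lam = 1e-3
beta = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

pred = A @ beta
mse = float(np.mean((pred - y) ** 2))
```

Because the hidden parameters are drawn at random, the fit quality varies with the draw; the paper's EEMS criterion selects among such random candidates analytically rather than retraining.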