Stochastic configuration networks with group lasso regularization

Published: 01 Jan 2024, Last Modified: 26 Jul 2025 · Inf. Sci. 2024 · CC BY-SA 4.0
Abstract: Stochastic configuration networks (SCNs) construct randomized learner models incrementally, node by node, under the guidance of a supervisory mechanism. Block-incremental SCNs (BSCN) extend the original SCNs with block increments to effectively reduce the number of iterations required during model building. However, two new issues emerge: the inequality constraints require a computationally expensive Moore-Penrose generalized inverse, and the resulting network may contain redundant hidden nodes. To address these limitations, this study presents efficient block-incremental SCNs (EBSCN) with group lasso regularization, termed EBSCNGL. The hidden block is treated as a specialized form of hidden node, and the output vector in the output-weight formula of SC-I (the first algorithmic implementation of SCNs) is replaced with the output matrix to evaluate the output weights of the newly added hidden block. Subsequently, a new set of inequalities that avoids the matrix generalized inverse is presented to ensure the universal approximation capability of EBSCN. Moreover, group lasso regularization is introduced to prune redundant nodes from the hidden layer. We further transform its regularized least-squares solution into an efficient form with proven convergence based on the Woodbury matrix identity. Empirical results on function approximation, benchmark classification, and a practical industrial application verify the efficiency and sparsity of the proposed method.
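To make the two main ingredients of the abstract concrete, the following is a minimal NumPy sketch, not the authors' implementation: it shows (i) a ridge-style regularized least-squares solve for the output weights in both the direct form and the Woodbury-identity (dual) form, and (ii) a simple group soft-thresholding step that zeroes out entire hidden-node weight rows, which is the basic mechanism by which group lasso identifies prunable nodes. All function names, the tanh activation, and the proximal step are illustrative assumptions; EBSCNGL's actual supervisory mechanism and block-update formulas are not reproduced here.

```python
import numpy as np

def ridge_output_weights_direct(H, T, lam):
    # Direct regularized least squares: beta = (H^T H + lam I)^{-1} H^T T
    L = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ T)

def ridge_output_weights_woodbury(H, T, lam):
    # Equivalent Woodbury/dual form: beta = H^T (H H^T + lam I)^{-1} T,
    # cheaper when the hidden-layer width exceeds the number of samples.
    n = H.shape[0]
    return H.T @ np.linalg.solve(H @ H.T + lam * np.eye(n), T)

def group_soft_threshold(beta, tau):
    # Group-lasso proximal step: shrink each hidden node's weight row as a group;
    # rows driven exactly to zero mark redundant nodes that can be pruned.
    norms = np.linalg.norm(beta, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return beta * scale

# Toy usage: random hidden layer on a small regression problem (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
T = np.sin(X.sum(axis=1, keepdims=True))
W = rng.uniform(-1, 1, (5, 30))
b = rng.uniform(-1, 1, 30)
H = np.tanh(X @ W + b)                        # hidden-layer output matrix
beta = ridge_output_weights_direct(H, T, lam=1e-2)
beta_sparse = group_soft_threshold(beta, tau=1e-2)
kept_nodes = np.linalg.norm(beta_sparse, axis=1) > 0  # surviving hidden nodes
```

The two solvers return identical weights; the Woodbury form only changes which Gram matrix is inverted (samples versus nodes), which is the kind of reformulation the abstract refers to when it mentions an "efficient form" of the regularized least-squares solution.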