Incremental Learning Algorithms for Broad Learning System with Node and Input Addition

Published: 01 Jan 2024 · Last Modified: 13 May 2025 · SMC 2024 · License: CC BY-SA 4.0
Abstract: The Broad Learning System (BLS) has been established as an effective flat-network alternative to Deep Neural Networks (DNNs), delivering high efficiency while achieving competitive accuracy. Despite these advantages, the existing incremental learning methods of BLS face stability and computational challenges when the network is expanded with new nodes or new training inputs. To overcome these limitations, we introduce two novel incremental learning algorithms for BLS based on factorization updates that optimize node and input addition. Our node-addition algorithm combines QR decomposition with Cholesky factorization, updating the Cholesky factor directly instead of recomputing pseudo-inverses. For input addition, we propose an iterative Cholesky-factor update algorithm. Our algorithms demonstrate not only faster computation than the existing BLS incremental methods but also improved testing accuracy on the MNIST and Fashion-MNIST datasets. This work represents a significant step toward the practical application and scalability of BLS in data-dense environments.
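To make the idea concrete, the following is a minimal sketch (not the paper's exact algorithms) of how Cholesky-factor updates can replace pseudo-inverse recomputation in a BLS-style ridge solve. It maintains the upper factor R of the regularized Gram matrix G = AᵀA + λI, applies a block update when new node columns are appended, and applies rank-1 updates when new input rows arrive. All names (A, B, lam, etc.) are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

# With G = A^T A + lam*I = R^T R, the output weights W = G^{-1} A^T Y
# follow from two triangular solves instead of a pseudo-inverse.
rng = np.random.default_rng(0)
n, m, k, lam = 200, 30, 10, 1e-2
A = rng.standard_normal((n, m))      # existing feature/enhancement nodes
Y = rng.standard_normal((n, 5))      # training targets
R = cholesky(A.T @ A + lam * np.eye(m), lower=False)

# --- Node addition: append k new node columns B via a block update of R ---
B = rng.standard_normal((n, k))
S = solve_triangular(R, A.T @ B, trans='T', lower=False)        # R^{-T} A^T B
M = cholesky(B.T @ B + lam * np.eye(k) - S.T @ S, lower=False)  # Schur complement
R_new = np.block([[R, S], [np.zeros((k, m)), M]])

A_full = np.hstack([A, B])
rhs = A_full.T @ Y
W = solve_triangular(R_new, solve_triangular(R_new, rhs, trans='T', lower=False),
                     lower=False)
W_ref = np.linalg.solve(A_full.T @ A_full + lam * np.eye(m + k), rhs)
err_nodes = np.max(np.abs(W - W_ref))

# --- Input addition: fold new sample rows into R via rank-1 updates ---
def chol_rank1_update(R, x):
    """Return the upper factor of R^T R + x x^T (classic LINPACK-style update)."""
    R, x = R.copy(), x.astype(float).copy()
    p = x.size
    for i in range(p):
        r = np.hypot(R[i, i], x[i])
        c, s = r / R[i, i], x[i] / R[i, i]
        R[i, i] = r
        if i + 1 < p:
            R[i, i + 1:] = (R[i, i + 1:] + s * x[i + 1:]) / c
            x[i + 1:] = c * x[i + 1:] - s * R[i, i + 1:]
    return R

Xn = rng.standard_normal((7, m + k))   # 7 new training inputs
Yn = rng.standard_normal((7, 5))
R_rows = R_new
for row in Xn:                          # G' = G + Xn^T Xn, one row at a time
    R_rows = chol_rank1_update(R_rows, row)
rhs_rows = rhs + Xn.T @ Yn
W_rows = solve_triangular(
    R_rows, solve_triangular(R_rows, rhs_rows, trans='T', lower=False), lower=False)
A_all = np.vstack([A_full, Xn])
W_rows_ref = np.linalg.solve(A_all.T @ A_all + lam * np.eye(m + k),
                             A_all.T @ np.vstack([Y, Yn]))
err_rows = np.max(np.abs(W_rows - W_rows_ref))
```

Both updates cost far less than refactoring the full Gram matrix: the block update is O(m²k) for k new columns, and each rank-1 row update is O(m²), versus O(m³) for a fresh factorization or pseudo-inverse.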