Broad Learning System with Proportional-Integral-Differential Gradient Descent

Published: 01 Jan 2020, Last Modified: 10 May 2023, ICA3PP (1) 2020
Abstract: The broad learning system (BLS) has attracted much attention in recent years due to its fast training speed and good generalization ability. Most existing BLS-based algorithms use the least squares method to compute the output weights. As the size of the training data set increases, this approach seriously reduces the training efficiency of the model, and the resulting solution also becomes unstable. To address this problem, we design a new gradient descent (GD) method based on the proportional-integral-differential (PID) technique to replace the least squares operation in existing BLS algorithms, which we call PID-GD-BLS. Extensive experimental results on four benchmark data sets show that PID-GD achieves a faster convergence rate than traditional optimization algorithms such as Adam and AdaMod, and that the generalization performance and stability of PID-GD-BLS are much better than those of BLS and its variants. This study provides a new direction for BLS optimization and a better solution for BLS-based data mining.
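The abstract does not give the update rule itself, so the following is only a minimal sketch of one plausible PID-style gradient descent for the BLS output weights, assuming a ridge-regularized squared-error loss, a proportional term equal to the current gradient, an integral term equal to the accumulated gradient, and a differential term equal to the change in gradient; the gains kp, ki, kd and the function name are hypothetical and not taken from the paper.

```python
import numpy as np

def pid_gd_output_weights(A, Y, kp=0.1, ki=0.01, kd=0.01,
                          lam=1e-3, n_iters=200):
    """Hypothetical PID-style gradient descent for BLS output weights.

    A : (n_samples, n_features) concatenated feature/enhancement node outputs
    Y : (n_samples, n_outputs)  training targets
    Iteratively minimizes ||A W - Y||^2 + lam * ||W||^2 instead of
    solving for W with a pseudoinverse (least squares).
    """
    n, _ = A.shape
    W = np.zeros((A.shape[1], Y.shape[1]))
    integral = np.zeros_like(W)    # accumulated gradient (I term)
    prev_grad = np.zeros_like(W)   # previous gradient, for the D term
    for _ in range(n_iters):
        grad = 2.0 * (A.T @ (A @ W - Y)) / n + 2.0 * lam * W  # P term
        integral += grad                                       # I term
        derivative = grad - prev_grad                          # D term
        W -= kp * grad + ki * integral + kd * derivative       # PID update
        prev_grad = grad
    return W
```

In a full BLS pipeline, A would be the horizontal concatenation of the mapped-feature and enhancement-node outputs; the default gains above are illustrative only and are not values reported in the paper.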