Multiple Additive Neural Networks: A Novel Approach to Continuous Learning in Regression and Classification

Published: 01 Jan 2023, Last Modified: 08 Oct 2024, IJCCI 2023, CC BY-SA 4.0
Abstract: Gradient Boosting is one of the leading techniques for the regression and classification of structured data. Recent adaptations and implementations use decision trees as base learners. In this work, we adapt the original Gradient Boosting approach to use nearly shallow neural networks as base learners. The proposed method supports a new architecture-based approach to continuous learning and employs strong heuristics against overfitting. As a result, the method, which we call Multiple Additive Neural Networks (MANN), is robust and achieves high accuracy. Our experiments show that MANN obtains more accurate predictions on well-known datasets than Extreme Gradient Boosting (XGB), while also being less prone to overfitting and less sensitive to the choice of the learning-rate and iteration-count hyperparameters.
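The core idea described in the abstract, an additive ensemble in which each stage fits a small neural network to the current residuals, can be sketched as follows. This is a minimal illustrative sketch of residual boosting with shallow networks, not the authors' MANN implementation; all function names, network sizes, and hyperparameter values here are assumptions chosen for clarity.

```python
import math
import random

def fit_shallow_net(X, residuals, hidden=4, epochs=200, lr=0.05, seed=0):
    """Fit a one-hidden-layer tanh network to the residuals by plain SGD.
    Illustrative base learner; sizes and rates are arbitrary assumptions."""
    rng = random.Random(seed)
    d = len(X[0])
    W = [[rng.uniform(-0.5, 0.5) for _ in range(d)] for _ in range(hidden)]
    b = [0.0] * hidden
    v = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    c = 0.0
    for _ in range(epochs):
        for x, target in zip(X, residuals):
            h = [math.tanh(sum(W[j][k] * x[k] for k in range(d)) + b[j])
                 for j in range(hidden)]
            err = sum(v[j] * h[j] for j in range(hidden)) + c - target
            # Gradient step on squared error for each layer's parameters.
            for j in range(hidden):
                grad_h = err * v[j] * (1.0 - h[j] ** 2)
                v[j] -= lr * err * h[j]
                for k in range(d):
                    W[j][k] -= lr * grad_h * x[k]
                b[j] -= lr * grad_h
            c -= lr * err

    def predict(x):
        h = [math.tanh(sum(W[j][k] * x[k] for k in range(d)) + b[j])
             for j in range(hidden)]
        return sum(v[j] * h[j] for j in range(hidden)) + c

    return predict

def boost(X, y, stages=20, shrinkage=0.3):
    """Additive ensemble: each stage fits a shallow net to the residuals
    left by the shrunken sum of the previous stages."""
    ensemble = []
    residuals = list(y)
    for s in range(stages):
        net = fit_shallow_net(X, residuals, seed=s)
        ensemble.append(net)
        residuals = [r - shrinkage * net(x) for x, r in zip(X, residuals)]

    def predict(x):
        return shrinkage * sum(net(x) for net in ensemble)

    return predict
```

As in standard gradient boosting, the shrinkage factor plays the role of the learning rate over stages; the abstract's claim is that the MANN heuristics make accuracy less sensitive to this value and to the number of stages than XGB is.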