Incremental Natural Gradient Boosting for Probabilistic Regression

Published: 01 Jan 2023 · Last Modified: 05 Jun 2025 · ADMA (1) 2023 · CC BY-SA 4.0
Abstract: The natural gradient boosting method for probabilistic regression (NGBoost) can predict not only point estimates but also full target distributions conditioned on the input features, thereby quantifying prediction uncertainty. However, NGBoost is designed only for batch settings and is not well suited to learning from data streams. In this paper, we present an incremental natural gradient boosting method for probabilistic regression (INGBoost). The proposed method uses the reduction of the scoring rule as its split metric and applies the Hoeffding inequality to incrementally construct decision trees that fit the natural gradient, thereby achieving incremental natural gradient boosting. Experimental results demonstrate that INGBoost performs well on both point regression and probabilistic regression tasks while retaining the interpretability of tree models. Furthermore, the model size of INGBoost is significantly smaller than that of NGBoost.
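The abstract names two ingredients: fitting base learners to the natural gradient (as in NGBoost) and using a Hoeffding-bound test to commit to tree splits incrementally (as in Hoeffding/VFDT-style trees). The sketch below illustrates both in isolation; it is not the paper's implementation. The Normal(mu, log sigma) parameterization under the log score and the function names (natural_gradient_normal, should_split) are illustrative assumptions.

```python
import math
import numpy as np

# --- Natural gradient for a Normal output under the log score ---
# NGBoost-style boosting fits base learners to the natural gradient:
# the ordinary gradient of the scoring rule preconditioned by the
# inverse Fisher information. For a Normal parameterized as
# (mu, log sigma), the Fisher information is diag(1/sigma^2, 2),
# which yields the closed form below. (Illustrative assumption.)
def natural_gradient_normal(y, mu, log_sigma):
    sigma2 = np.exp(2.0 * log_sigma)
    d_mu = mu - y  # F^{-1} applied to (mu - y) / sigma^2
    d_log_sigma = 0.5 * (1.0 - (y - mu) ** 2 / sigma2)
    return d_mu, d_log_sigma

# --- Hoeffding-bound split test, as in incremental decision trees ---
# With probability 1 - delta, the true mean of a variable with range R
# lies within eps of its sample mean after n observations.
def hoeffding_bound(value_range, delta, n):
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

# Commit to the best candidate split once its observed advantage in
# score reduction over the runner-up exceeds the Hoeffding bound eps.
def should_split(best_gain, second_gain, value_range, delta, n):
    return (best_gain - second_gain) > hoeffding_bound(best_gain and value_range, delta, n)
```

A quick usage check: with value_range=1.0, delta=1e-6 and n=1000 observations, hoeffding_bound gives eps ≈ 0.083, so a split whose gain leads the runner-up by more than that margin would be accepted.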