K-Free Dependence Bayesian Classifiers

Kexin Meng, Huan Zhang, Liangxiao Jiang, Pei Lv, Shuo He, Mingliang Xu

Published: 01 Jan 2026, Last Modified: 26 Feb 2026. IEEE Transactions on Neural Networks and Learning Systems. CC BY-SA 4.0.
Abstract: As one of the most attractive Bayesian network classifiers (BNCs), the K-dependence Bayesian (KDB) classifier can effectively capture dependencies between attributes by allowing each attribute to be conditioned on the class and at most K other attributes. However, as K grows, the structural complexity increases sharply, which inevitably introduces a risk of overfitting. Moreover, once K is fixed, the structure is immutable, which severely limits the expressive power of the final model. To address these two issues, we propose K-free dependence Bayesian (KFDB) classifiers, which learn an adaptive number of parent nodes for each attribute. To search for the optimal structure, we sequentially evaluate candidate submodels either by minimizing the mean squared error (MSE) or by maximizing the classification accuracy (ACC), resulting in two versions denoted KFDBMSE and KFDBACC, respectively. Experimental results on 60 benchmark datasets demonstrate that KFDB significantly outperforms the classical KDB and other state-of-the-art models.
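To make the scoring rule concrete, the sketch below illustrates (in Python) how a K-dependence classifier scores a sample: each attribute x_i is conditioned on the class and a small set of parent attributes, and the class maximizing P(c) · ∏ P(x_i | c, parents(x_i)) is chosen. This is a minimal illustration, not the authors' implementation: the toy data, the hand-fixed K=1 parent structure, and the Laplace smoothing are all assumptions made for the example.

```python
# Minimal K-dependence Bayesian scoring sketch (hypothetical toy data,
# hand-fixed K=1 structure; not the paper's KFDB search procedure).

# Toy training data: rows of binary attributes (x0, x1, x2) and class labels.
X = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0), (0, 0, 0), (1, 1, 1)]
y = [0, 0, 1, 1, 0, 1]
classes = sorted(set(y))

# K=1 structure: parents[i] lists the attribute parents of x_i
# (each attribute also implicitly depends on the class).
parents = {0: [], 1: [0], 2: [0]}

def cond_prob(i, xi, c, pvals, alpha=1.0, card=2):
    """Laplace-smoothed estimate of P(x_i = xi | class = c, parent values)."""
    num = den = 0
    for row, label in zip(X, y):
        if label != c:
            continue
        if all(row[p] == v for p, v in zip(parents[i], pvals)):
            den += 1
            if row[i] == xi:
                num += 1
    return (num + alpha) / (den + alpha * card)

def score(sample, c):
    """Unnormalized P(c) * prod_i P(x_i | c, parents(x_i))."""
    s = (y.count(c) + 1) / (len(y) + len(classes))  # smoothed class prior
    for i in range(len(sample)):
        pvals = [sample[p] for p in parents[i]]
        s *= cond_prob(i, sample[i], c, pvals)
    return s

def classify(sample):
    return max(classes, key=lambda c: score(sample, c))
```

A K-free variant in the spirit of the paper would not fix `parents` in advance but would grow each attribute's parent set adaptively, keeping an extra parent only when a validation criterion (MSE or accuracy, per the abstract) improves.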