Abstract: In recent years, numerous attribute weighting methods have been proposed to alleviate the attribute conditional independence assumption in naive Bayes (NB). Among them, multi-view attribute weighted naive Bayes (MAWNB) has achieved state-of-the-art performance by constructing two label views from the raw attributes, which enables it to capture more comprehensive data characteristics. However, in its multi-view construction module, the instances used to train the base classifiers are also employed to generate the label views, which inevitably introduces a risk of overfitting. In addition, in its multi-view fusion module, the estimated class-membership probabilities from different views are simply averaged to predict the class label, ignoring the varying importance of each view when classifying different test instances. To address these issues, we propose an enhanced version of MAWNB, termed EMAWNB, that simultaneously improves its generalization and specificity. Extensive experiments on 59 benchmark datasets demonstrate that EMAWNB significantly outperforms the original MAWNB as well as other state-of-the-art competitors.