A decision cognizant Kullback-Leibler divergence

Pattern Recognit., 2017 (modified: 13 Jun 2021)
Highlights
• The decision cognizant Kullback–Leibler divergence is a better statistic for measuring classifier (in)congruence.
• Analytic and simulation studies show the new divergence is more robust to minority class clutter.
• Its sensitivity to estimation error is lower than that of the classical Kullback–Leibler divergence.

Abstract
In decision making systems involving multiple classifiers, there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named the decision cognizant Kullback–Leibler divergence (DC-KL), to reduce the contribution of the minority classes, which obscure the true degree of classifier incongruence. We investigate the properties of the novel divergence measure analytically and by simulation studies. The proposed measure is demonstrated to be more robust to minority class clutter. Its sensitivity to estimation noise is also shown to be considerably lower than that of the classical KL divergence. These properties render the DC-KL divergence a much better statistic for discriminating between classifier congruence and incongruence in pattern recognition systems.
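For context, a minimal Python sketch of the idea behind the two divergences. The classical KL divergence is standard; the `dc_kl_divergence` function below is a hypothetical illustration only, assuming that the decision cognizant variant pools the non-decided (minority) classes into a single clutter bin before comparing the two classifier output distributions. The exact definition is given in the paper, not here.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Classical Kullback-Leibler divergence D(p || q) between two
    discrete class-posterior distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def dc_kl_divergence(p, q, eps=1e-12):
    """Sketch of a decision cognizant KL divergence (assumed pooling rule):
    keep the classes either classifier would decide on (their argmax
    classes) and merge the remaining minority classes into one clutter
    bin, so their noise contributes less to the measured incongruence."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    decided = sorted({int(np.argmax(p)), int(np.argmax(q))})
    rest = [k for k in range(len(p)) if k not in decided]
    p_dc = np.append(p[decided], p[rest].sum())
    q_dc = np.append(q[decided], q[rest].sum())
    return kl_divergence(p_dc, q_dc, eps)

# Example: two classifiers agreeing on class 0, with different
# low-probability "clutter" spread over the minority classes.
p = [0.70, 0.10, 0.05, 0.05, 0.10]
q = [0.65, 0.05, 0.10, 0.15, 0.05]
print(kl_divergence(p, q), dc_kl_divergence(p, q))
```

In this example the pooled version reports a smaller divergence for the two (congruent) outputs, illustrating how downweighting minority class clutter can make the statistic a cleaner indicator of genuine disagreement.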