Abstract: Covariance estimation is a core component of adaptive target detection. Most existing work focuses on the Mean Squared Error (MSE) metric because it is analytically convenient. However, MSE does not always capture the statistical information needed for detection. We advocate switching to the Kullback-Leibler (KL) divergence. To support this, we analyze the Normalized Signal to Noise Ratio (NSNR) associated with the worst-case target and show that the KL metric has a structure similar to NSNR and bounds it. To further illustrate the point, we derive a simple variant of a classic MSE-based estimator by incorporating KL into a leave-one-out cross-validation (LOOCV) framework. Numerical experiments with various estimators on both synthetic and real data demonstrate that KL and NSNR behave similarly and differ from MSE, and that simply changing the metric in the LOOCV estimator improves KL and NSNR performance at the cost of MSE performance.
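For concreteness, the two metrics being contrasted can be written in their standard forms (these are textbook definitions for a p-dimensional zero-mean Gaussian model, not formulas quoted from the paper): the MSE criterion penalizes the Frobenius error of the covariance estimate, whereas the KL divergence between the estimated and true Gaussian distributions depends on the estimate through the whitened matrix $\hat{\Sigma}^{-1}\Sigma$,
$$
\mathrm{MSE}(\hat{\Sigma}) = \mathbb{E}\,\|\hat{\Sigma} - \Sigma\|_F^2, \qquad
\mathrm{KL}\!\left(\mathcal{N}(0,\Sigma)\,\|\,\mathcal{N}(0,\hat{\Sigma})\right)
= \tfrac{1}{2}\left[\operatorname{tr}\!\big(\hat{\Sigma}^{-1}\Sigma\big) - \log\det\!\big(\hat{\Sigma}^{-1}\Sigma\big) - p\right].
$$
The KL criterion, like NSNR-type detection measures, is invariant to joint linear transformations of $\Sigma$ and $\hat{\Sigma}$ and depends only on how well the estimate whitens the true covariance, which is the intuition behind preferring it over MSE for detection.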