Close Approximation of Kullback–Leibler Divergence for Sparse Source Retrieval

Published: 30 Apr 2019, Last Modified: 01 Nov 2024, IEEE Signal Processing Letters, CC BY 4.0
Abstract: In this paper, we propose a fast and accurate approximation of the Kullback–Leibler divergence (KLD) between two Bernoulli-Generalized Gaussian (Ber-GG) distributions. Such a distribution has been found to be well-suited for modeling sparse signals such as wavelet-based representations. Based on high-bitrate approximations of the entropy of quantized Ber-GG sources, we provide a close approximation of the KLD without resorting to the conventional, time-consuming Monte Carlo estimation approach. The developed approximation formula is then validated in the context of depth map and stereo image retrieval.
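To make the baseline concrete, the Monte Carlo estimation the abstract refers to can be sketched as follows. This is not the paper's proposed formula; it illustrates the conventional approach it replaces, under an assumed Ber-GG parameterization (a mass at zero with probability 1 - p, and with probability p a draw from a zero-mean generalized Gaussian with shape beta and scale alpha; the function name and parameters are hypothetical):

```python
import numpy as np
from scipy.stats import gennorm


def mc_kld_ber_gg(p1, beta1, alpha1, p2, beta2, alpha2, n=100_000, seed=0):
    """Monte Carlo estimate of KL(P || Q) for two Ber-GG sources.

    Each source emits 0 with probability 1 - p, and otherwise draws from a
    zero-mean generalized Gaussian (scipy's `gennorm`, shape beta, scale alpha).
    The KLD splits into a closed-form discrete part (the masses at zero) and a
    continuous part estimated by averaging log-density ratios over samples.
    """
    # Discrete contribution from the point masses at zero.
    kld = (1 - p1) * np.log((1 - p1) / (1 - p2)) + p1 * np.log(p1 / p2)
    # Continuous contribution: p1 * E_{x ~ GG1}[log gg1(x) - log gg2(x)].
    x = gennorm.rvs(beta1, scale=alpha1, size=n, random_state=seed)
    kld += p1 * np.mean(
        gennorm.logpdf(x, beta1, scale=alpha1)
        - gennorm.logpdf(x, beta2, scale=alpha2)
    )
    return kld
```

The large sample count needed for a stable estimate is precisely the cost the paper's closed-form approximation avoids.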