Entropy Bounds on Abelian Groups and the Ruzsa Divergence

IEEE Trans. Inf. Theory, 2018 (modified: 07 Nov 2022)
Abstract: Over the past few years, a family of interesting new inequalities for the entropies of sums and differences of random variables has been developed by Ruzsa, Tao, and others, motivated by analogous results in additive combinatorics. This paper extends these earlier results to the case of random variables taking values in R^n or, more generally, in arbitrary locally compact and Polish abelian groups. We isolate and study a key quantity, the Ruzsa divergence between two probability distributions, and we show that its properties can be used to extend the earlier inequalities to the present general setting. The new results established include several variations on the theme that the entropies of the sum and the difference of two independent random variables severely constrain each other. Although the setting is quite general, the results are already of interest (and new) for random vectors in R^n. In that special case, we discuss quantitative bounds for the stability of the equality conditions in the entropy power inequality, a reverse entropy power inequality for log-concave random vectors, an information-theoretic analog of the Rogers-Shephard inequality for convex bodies, and consequences of some of our results for determinant inequalities for sums of positive-definite matrices. Moreover, by considering various multiplicative subgroups of the complex plane, one obtains new inequalities for the differential entropies of products and ratios of nonzero, complex-valued random variables.
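As a small numerical illustration of the quantities the abstract refers to, the sketch below evaluates differential entropies of sums and differences of independent Gaussians, where the closed form h(N(0, v)) = ½ log(2πev) applies. It also computes the entropy analogue of the Ruzsa distance, d(X, Y) = h(X − Y) − ½(h(X) + h(Y)), used by Ruzsa and Tao in the discrete setting; note this illustrative distance is related to, but not the same as, the Ruzsa divergence studied in the paper itself. All variable names and the chosen variances are hypothetical.

```python
import math

def h_gauss(var):
    """Differential entropy (in nats) of a centered Gaussian with variance `var`."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Independent X ~ N(0, v1) and Y ~ N(0, v2): both X + Y and X - Y are
# distributed as N(0, v1 + v2), so their entropies coincide exactly --
# the extreme case of the "sum and difference constrain each other" theme.
v1, v2 = 1.0, 4.0
h_sum = h_gauss(v1 + v2)
h_diff = h_gauss(v1 + v2)

# Entropy analogue of the Ruzsa distance (discrete theory of Ruzsa/Tao):
# d(X, Y) = h(X - Y) - (h(X) + h(Y)) / 2, which is nonnegative for
# independent X and Y since h(X - Y) >= max(h(X), h(Y)).
d_ruzsa = h_diff - 0.5 * (h_gauss(v1) + h_gauss(v2))

print(h_sum - h_diff, d_ruzsa)
```

For Gaussians the gap h(X+Y) − h(X−Y) vanishes identically; the general results of the paper quantify how far this gap can stray for arbitrary distributions on the groups considered.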