Explicit Renyi Entropy for Hidden Markov Models

Published: 01 Jan 2020 · Last Modified: 03 Jul 2024 · ISIT 2020 · CC BY-SA 4.0
Abstract: Determining entropy rates of stochastic processes is a fundamental but difficult problem, with closed-form solutions known only for specific cases. This paper pushes the state of the art by solving the problem for Hidden Markov Models (HMMs) and Rényi entropies. While the computation of Rényi entropy for Markov chains reduces to studying the growth of a simple matrix product, the computation for HMMs involves products of random matrices. As a result, this case is much harder, and no explicit formulas were previously known. In the finite-sample regime we circumvent this issue for Rényi entropies of integer orders, reducing the problem again to powers of a single matrix, built from the transition and emission probabilities by means of tensor products. To obtain results in the asymptotic setting, we use a novel technique for determining the growth of non-negative matrix powers. The classical approach, Perron-Frobenius theory, requires positivity assumptions; we instead work directly with the spectral formula. As a consequence, our results do not suffer from limitations such as irreducibility and aperiodicity, which improves our understanding of the entropy rate even for standard (unhidden) chains. Our result has also been used to prove the effectiveness of a recently published side-channel attack against RSA.
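The integer-order reduction described in the abstract can be illustrated numerically. The sketch below is ours, not code from the paper: for an HMM with initial distribution pi, transition matrix P and emission matrix B, it computes the order-k Rényi entropy of the first T observations from powers of the single matrix P^{⊗k}·E, where E = Σ_y diag(B[:,y])^{⊗k}, and checks it against brute-force enumeration of all observation sequences. All function names and the small example HMM are illustrative assumptions.

```python
import numpy as np
from functools import reduce
from itertools import product

def kron_power(A, k):
    """k-fold Kronecker (tensor) power of a vector or matrix."""
    return reduce(np.kron, [A] * k)

def renyi_entropy_hmm(pi, P, B, k, T):
    """Rényi entropy of integer order k >= 2 of the first T observations
    of an HMM (pi: initial distribution, P: transition matrix, B: emission
    matrix with rows indexed by hidden states, columns by output symbols).

    Uses the tensor-product reduction
        sum_y Pr(y)^k = (pi^{(x)k})^T E (P^{(x)k} E)^{T-1} 1,
    with E = sum_y diag(B[:, y])^{(x)k}, so only powers of one fixed
    n^k-by-n^k matrix are needed instead of a product of random matrices.
    """
    n, m = B.shape
    E = sum(kron_power(np.diag(B[:, y]), k) for y in range(m))
    M = kron_power(P, k) @ E
    total = kron_power(pi, k) @ E @ np.linalg.matrix_power(M, T - 1) @ np.ones(n ** k)
    return np.log(total) / (1 - k)

def renyi_entropy_bruteforce(pi, P, B, k, T):
    """Direct computation enumerating all m^T observation sequences."""
    m = B.shape[1]
    s = 0.0
    for y in product(range(m), repeat=T):
        v = pi * B[:, y[0]]          # forward variables for this sequence
        for t in range(1, T):
            v = (v @ P) * B[:, y[t]]
        s += v.sum() ** k            # Pr(y)^k
    return np.log(s) / (1 - k)
```

In the asymptotic setting, the same matrix governs the entropy rate: if λ is the spectral radius of P^{⊗k}·E, the order-k Rényi entropy rate is log(λ)/(1−k), which is where the paper's spectral-formula technique (avoiding Perron-Frobenius positivity assumptions) comes in.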