On the Gaussianity of Kolmogorov Complexity of Mixing Sequences
Abstract: Let K(X_{1:n}) and H(X_0 \mid X_{-1}, X_{-2}, \ldots) denote the Kolmogorov complexity of X_{1:n} = (X_1, \ldots, X_n) and the entropy rate of a stationary and ergodic process \{X_i\}_{i=-\infty}^{\infty}, respectively. It has been proved that \frac{K(X_{1:n})}{n} - H(X_0 \mid X_{-1}, X_{-2}, \ldots) \rightarrow 0 almost surely. This paper studies the rate of this convergence. In particular, we show that if the process satisfies certain mixing conditions, then there exists \sigma < \infty such that \sqrt{n}\left(\frac{K(X_{1:n})}{n} - H(X_0 \mid X_{-1}, X_{-2}, \ldots)\right) \rightarrow_d N(0, \sigma^2). Furthermore, we show that, under slightly stronger mixing conditions, one may obtain non-asymptotic concentration bounds for the Kolmogorov complexity.
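The almost-sure convergence K(X_{1:n})/n \rightarrow H can be illustrated numerically, with the caveat that Kolmogorov complexity itself is uncomputable. A minimal sketch, using the DEFLATE compressed length from Python's `zlib` as a computable upper-bound proxy for K (this proxy, the i.i.d. Bernoulli source, and the sample size are illustrative assumptions, not part of the paper):

```python
import random
import zlib

def compressed_rate(bits):
    """Bits of compressed output per source symbol: an upper-bound
    proxy for K(X_{1:n})/n, with zlib standing in for the
    (uncomputable) Kolmogorov complexity."""
    # Pack the 0/1 symbols into bytes before compressing.
    data = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        data.append(byte)
    return 8 * len(zlib.compress(bytes(data), 9)) / len(bits)

random.seed(0)
n = 200_000
# Stationary ergodic (here i.i.d.) binary sources:
fair = [1 if random.random() < 0.5 else 0 for _ in range(n)]    # H = 1 bit/symbol
biased = [1 if random.random() < 0.1 else 0 for _ in range(n)]  # H ≈ 0.469 bits/symbol

print(compressed_rate(fair))    # near 1 bit/symbol (incompressible)
print(compressed_rate(biased))  # noticeably below 1 bit/symbol
```

For the fair coin the per-symbol rate hovers near the entropy rate of 1 bit; for the biased coin it drops toward H(0.1), though a general-purpose compressor leaves a gap above the entropy. The fluctuations of this rate around H across independent runs are what the paper's central limit theorem describes, for the true complexity K rather than this proxy.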