Relating description complexity to entropy

Published: 01 Jan 2025 · Last Modified: 25 Jul 2025 · J. Comput. Syst. Sci. 2025 · CC BY-SA 4.0
Abstract: We demonstrate novel links between entropy and description complexity, a notion referring to the minimal formula length needed to specify a given property. Let PLC denote propositional logic extended with the ability to count assignments, and let PLC^1 be the fragment that counts only to one, essentially quantifying over assignments. In the finite, PLC^1 is expressively complete for specifying sets of variable assignments, while PLC is expressively complete for multisets. We show that for both logics, the model classes with maximal Boltzmann entropy are precisely the ones with maximal description complexity. Concerning PLC, we show that expected Boltzmann entropy is asymptotically equivalent to expected description complexity multiplied by the number of proposition symbols considered. For contrast, we prove that this link breaks for first-order logic over vocabularies with higher-arity relations. Our results relate to known links between Kolmogorov complexity and entropy, providing analogous results in a logic-based setting where relational structures are classified by formulas of different sizes.
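As a rough illustration of the Boltzmann entropy of a model class (not the paper's formal development, which is not reproduced in this abstract): under the standard reading S = ln W, the entropy of a propositional formula's model class is the logarithm of its number of satisfying assignments. The function name and example formulas below are illustrative assumptions, not taken from the paper.

```python
from itertools import product
from math import log

def boltzmann_entropy(formula, n_vars):
    """Illustrative sketch: S = ln W, where W is the number of
    satisfying assignments over n_vars proposition symbols
    (i.e., the size of the formula's model class)."""
    models = [bits for bits in product([False, True], repeat=n_vars)
              if formula(bits)]
    return log(len(models)) if models else float("-inf")

# Over 3 proposition symbols p0, p1, p2:
xor_like = lambda b: b[0] != b[1]   # 4 models -> S = ln 4
tautology = lambda b: True          # 8 models -> S = ln 8 (maximal)

print(boltzmann_entropy(xor_like, 3))
print(boltzmann_entropy(tautology, 3))
```

Here the tautology has the maximal entropy ln 2^n; the paper's result concerns which model classes attain such maxima and how that relates to the length of the shortest defining formula.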