MDPs as Distribution Transformers: Affine Invariant Synthesis for Safety Objectives

Published: 01 Jan 2023 · Last Modified: 30 Sep 2024 · CAV (3) 2023 · CC BY-SA 4.0
Abstract: Markov decision processes (MDPs) can be viewed as transformers of probability distributions. While this view is useful in practice for reasoning about trajectories of distributions, basic reachability and safety problems are known to be computationally intractable (i.e., Skolem-hard) in such models. Further, we show that, even for simple MDPs, strategies for safety objectives over distributions can require infinite memory and randomization.
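To make the distribution-transformer view concrete, here is a minimal sketch (not from the paper; the MDP, action names, and safety threshold are illustrative assumptions). A strategy picks an action at each step, and the current distribution over states is pushed forward through that action's row-stochastic transition matrix; a safety objective over distributions then constrains the entire trajectory of distributions, e.g. via an affine inequality that must hold at every step.

```python
# Hypothetical 2-state MDP with two actions "a" and "b"; each action is a
# row-stochastic transition matrix P, so P[i][j] is the probability of
# moving from state i to state j under that action.
MDP = {
    "a": [[0.5, 0.5],
          [0.0, 1.0]],
    "b": [[1.0, 0.0],
          [0.3, 0.7]],
}

def push_forward(mu, P):
    """Push a distribution mu through P: (mu P)[j] = sum_i mu[i] * P[i][j]."""
    n = len(P[0])
    return [sum(mu[i] * P[i][j] for i in range(len(mu))) for j in range(n)]

# Start surely in state 0 and apply a fixed (memoryless, deterministic)
# action sequence, tracking the trajectory of distributions.
mu = [1.0, 0.0]
trajectory = [mu]
for action in ["a", "b", "a"]:
    mu = push_forward(mu, MDP[action])
    trajectory.append(mu)

# An (illustrative) affine safety objective over distributions: the
# probability mass on state 0 must stay at least 0.1 along the trajectory.
safe = all(dist[0] >= 0.1 for dist in trajectory)
print(trajectory[-1], safe)
```

The point of the paper's hardness results is that deciding such properties for all steps of the infinite trajectory is Skolem-hard in general, and that winning strategies may need infinite memory and randomization, so simple finite-memory simulation like the above is only a per-prefix check.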