Learning Hierarchical World Models with Adaptive Temporal Abstractions from Discrete Latent Dynamics

Published: 20 Jul 2023, Last Modified: 31 Aug 2023 · EWRL16
Keywords: world models, temporal abstraction, model-based reinforcement learning, hierarchical planning
TL;DR: We propose an algorithm to learn a hierarchy of world models from sparse latent state changes for explainable, long-horizon planning.
Abstract: Hierarchical world models have the potential to significantly improve model-based reinforcement learning (MBRL) and planning by enabling reasoning across multiple time scales. Nonetheless, the majority of state-of-the-art MBRL methods still employ flat, non-hierarchical models. The challenge lies in learning suitable hierarchical abstractions. We propose Temporal Hierarchies from Invariant Context Kernels (THICK), an algorithm that learns a world model hierarchy based on discrete latent dynamics. The lower level of the THICK world model selectively updates parts of its latent state sparsely in time, forming invariant contexts. The higher level is trained exclusively to predict situations involving these sparse context state changes. Our experiments demonstrate that THICK learns categorical, interpretable, temporal abstractions on the high level while maintaining precise low-level predictions. Furthermore, we show that the developing hierarchical predictive model can seamlessly enhance the abilities of MBRL or planning methods. We believe that THICK-like, hierarchical world models will be key for developing more sophisticated agents capable of exploring, planning, and reasoning about the future across multiple time scales.
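The abstract describes a lower level that updates parts of its latent state only sparsely in time, yielding piecewise-constant "contexts" whose rare changes the higher level learns to predict. As a rough illustration of that gating idea only (a hedged toy sketch, not the paper's actual THICK implementation; the hard threshold gate and all names here are assumptions), a sparse context update can look like this:

```python
import numpy as np

def sparse_context_update(context, candidate, gate_logit, threshold=0.0):
    """Update the context only when the gate opens; otherwise keep it invariant.

    The hard binary gate and threshold are illustrative stand-ins for a
    learned sparsity mechanism.
    """
    gate = 1.0 if gate_logit > threshold else 0.0  # gate is 0 or 1
    new_context = gate * candidate + (1.0 - gate) * context
    return new_context, gate

# Toy rollout: the gate rarely opens, so the context stays piecewise constant
# and only the rare change events would be passed to a higher level.
rng = np.random.default_rng(0)
context = np.zeros(3)
changes = 0
for step in range(100):
    candidate = rng.normal(size=3)
    gate_logit = rng.normal() - 2.0  # bias toward keeping the gate closed
    context, gate = sparse_context_update(context, candidate, gate_logit)
    changes += int(gate)

# `changes` counts how often the context actually changed over 100 steps;
# with the closed-gate bias this is a small fraction of the steps.
```

In the paper's framing, a higher-level model would be trained only on the few timesteps where such a gate opens, which is what yields the temporal abstraction.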