A Novel Framework for Uncertainty-Driven Adaptive Exploration
Keywords: Reinforcement Learning, Adaptive Exploration, Uncertainty-Driven Exploration
TL;DR: A generic adaptive exploration framework that employs uncertainty to alternate between exploration and exploitation in a principled manner.
Abstract: Adaptive exploration methods learn complex policies by alternating between exploration and exploitation. A key question for such methods is determining the appropriate moment to switch from exploration to exploitation and vice versa; this is critical in domains that require learning long and complex sequences of actions. In this work, we present a generic adaptive exploration framework that employs uncertainty to address this issue in a principled manner. Our framework includes previous adaptive exploration approaches as special cases. Moreover, it can incorporate any uncertainty-measuring mechanism of choice, such as the mechanisms used in intrinsic motivation or epistemic uncertainty-based exploration methods, and is experimentally shown to give rise to adaptive exploration strategies that outperform standard ones across several MuJoCo environments. Finally, we showcase its potential for use in safety-critical domains.
Area: Learning and Adaptation (LEARN)
Generative AI: I acknowledge that I have read and will follow this policy.
Submission Number: 187