On Interruptible Pure Exploration in Multi-Armed Bandits

AAAI 2015 (modified: 16 Jul 2019)
Abstract: Interruptible pure exploration in multi-armed bandits (MABs) is a key component of Monte-Carlo tree search algorithms for sequential decision problems. We introduce Discriminative Bucketing (DB), a novel family of strategies for pure exploration in MABs, which allows for adapting recent advances in non-interruptible strategies to the interruptible setting, while guaranteeing an exponential-rate performance improvement over time. Our experimental evaluation demonstrates that the corresponding instances of DB compete favorably both with the currently popular strategies UCB1 and ε-Greedy, and with conservative uniform sampling.
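To make the setting concrete, the following is a minimal, hypothetical sketch of interruptible pure exploration in a Bernoulli MAB (not the paper's DB algorithm): a sampling strategy is run until interrupted, at which point the empirically best arm is recommended. Two of the baselines mentioned in the abstract, uniform sampling and ε-Greedy, are sketched; all function names and parameters are illustrative assumptions.

```python
import random


def pull(mean):
    """Draw a Bernoulli reward from an arm with the given mean."""
    return 1.0 if random.random() < mean else 0.0


def recommend(sums, counts):
    """At interruption time, recommend the empirically best arm."""
    return max(range(len(counts)),
               key=lambda i: sums[i] / counts[i] if counts[i] else 0.0)


def uniform_explore(means, budget):
    """Conservative uniform (round-robin) sampling until the budget
    (interruption point) is reached, then recommend."""
    k = len(means)
    sums, counts = [0.0] * k, [0] * k
    for t in range(budget):
        i = t % k
        sums[i] += pull(means[i])
        counts[i] += 1
    return recommend(sums, counts)


def epsilon_greedy_explore(means, budget, eps=0.1):
    """epsilon-Greedy sampling: with probability eps pull a uniformly
    random arm, otherwise pull the current empirical best; the final
    recommendation is again the empirical best."""
    k = len(means)
    sums, counts = [0.0] * k, [0] * k
    for i in range(k):  # pull each arm once to initialize estimates
        sums[i] += pull(means[i])
        counts[i] += 1
    for _ in range(budget - k):
        i = random.randrange(k) if random.random() < eps \
            else recommend(sums, counts)
        sums[i] += pull(means[i])
        counts[i] += 1
    return recommend(sums, counts)
```

The quality of a recommendation is its simple regret: the gap between the best arm's mean and the recommended arm's mean. An interruptible strategy must keep this quantity small at every possible stopping time, not just at a fixed, known budget.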