Resource Allocation in Multi-armed Bandit Exploration: Overcoming Sublinear Scaling with Adaptive Parallelism

ICML 2021
Abstract: We study exploration in stochastic multi-armed bandits when we have access to a divisible resource that can be allocated in varying amounts to arm pulls. We focus in particular on the allocation of...
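To make the setting concrete, here is a minimal toy sketch, not the paper's method: following the abstract, each arm pull consumes a chosen amount of a divisible resource, and, motivated by the "sublinear scaling" in the title, we assume a pull allocated r units of resource completes in time r^(-alpha) with alpha < 1 (so doubling the resource less than halves the pull time). The functions `pull` and `uniform_explore`, the speedup exponent `alpha`, and the parameter `budget_per_round` are all hypothetical names introduced for illustration; the baseline below splits the resource uniformly rather than adaptively.

```python
import numpy as np

rng = np.random.default_rng(0)

def pull(mean: float, resource: float, alpha: float = 0.5) -> tuple[float, float]:
    """Simulate one arm pull given `resource` units of the divisible resource.

    Returns (reward sample, wall-clock duration). The duration model
    resource**(-alpha) is an assumption standing in for sublinear scaling.
    """
    reward = mean + rng.normal()      # stochastic reward with unit noise
    duration = resource ** (-alpha)   # sublinear speedup in allocated resource
    return reward, duration

def uniform_explore(means, budget_per_round: float, rounds: int):
    """Naive baseline: split the resource equally across all arms each round
    and pull every arm once in parallel. A round's duration is the duration
    of a single pull, since the pulls run concurrently."""
    k = len(means)
    estimates = np.zeros(k)
    counts = np.zeros(k)
    elapsed = 0.0
    for _ in range(rounds):
        r = budget_per_round / k      # equal split of the resource
        round_time = 0.0
        for i, mu in enumerate(means):
            x, t = pull(mu, r)
            counts[i] += 1
            estimates[i] += (x - estimates[i]) / counts[i]  # running mean
            round_time = max(round_time, t)                 # parallel pulls
        elapsed += round_time
    return estimates, elapsed

means = [0.2, 0.5, 0.9]              # hypothetical true arm means
est, time_used = uniform_explore(means, budget_per_round=3.0, rounds=200)
print("estimated means:", np.round(est, 2), "time:", round(time_used, 1))
```

Under this toy model, giving an arm more resource shortens its pulls with diminishing returns, which is exactly the trade-off an adaptive allocation scheme would exploit over this uniform baseline.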