ICML 2019 (modified: 05 Nov 2022)
Abstract: While the objective in traditional multi-armed bandit problems is to find the arm with the highest mean, in many settings, finding an arm that best captures information about other arms is of interest...