Multi-armed Bandit Problems with History

AISTATS 2012
Abstract: In this paper we consider the stochastic multi-armed bandit problem. However, unlike in the conventional version of this problem, we do not assume that the algorithm starts from scratch. Many appli...
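The abstract describes a bandit setting in which the learner is given historical observations for each arm before online play begins. The paper's own algorithm is not shown in this truncated abstract, so the following is only a minimal illustrative sketch, assuming a standard UCB1 rule whose per-arm counts and reward sums are seeded with the historical data rather than initialized to zero. The class name `UCB1WithHistory` and its parameters are hypothetical, introduced purely for illustration.

```python
import math
import random


class UCB1WithHistory:
    """UCB1 bandit whose statistics are seeded with historical pulls.

    Illustrative sketch only; not the algorithm proposed in the paper.
    """

    def __init__(self, n_arms, hist_counts=None, hist_rewards=None):
        # Start from the historical per-arm counts and reward sums
        # instead of starting from scratch.
        self.counts = list(hist_counts) if hist_counts else [0] * n_arms
        self.sums = list(hist_rewards) if hist_rewards else [0.0] * n_arms

    def select_arm(self):
        # Pull any arm that has never been tried, online or in the history.
        for arm, c in enumerate(self.counts):
            if c == 0:
                return arm
        total = sum(self.counts)
        # Otherwise pick the arm with the largest upper confidence bound.
        ucb = [
            self.sums[a] / self.counts[a]
            + math.sqrt(2 * math.log(total) / self.counts[a])
            for a in range(len(self.counts))
        ]
        return max(range(len(ucb)), key=ucb.__getitem__)

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.sums[arm] += reward


if __name__ == "__main__":
    # Two Bernoulli arms with means 0.4 and 0.6; the history favors arm 1.
    means = [0.4, 0.6]
    bandit = UCB1WithHistory(2, hist_counts=[5, 20], hist_rewards=[2.0, 13.0])
    for _ in range(1000):
        arm = bandit.select_arm()
        bandit.update(arm, 1.0 if random.random() < means[arm] else 0.0)
    print("pull counts after 1000 rounds:", bandit.counts)
```

The design point the sketch illustrates is simply that historical data enters through the initial counts and reward sums, so arms that the history already identifies as weak receive fewer exploratory pulls online.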