Optimal Algorithms for Multiplayer Multi-Armed Bandits
Po-An Wang, Alexandre Proutière, Kaito Ariu, Yassir Jedra, Alessio Russo
AISTATS 2020
Abstract:
The paper addresses various Multiplayer Multi-Armed Bandit (MMAB) problems, where M decision-makers, or players, collaborate to maximize their cumulative reward. We first investigate the MMAB probl...
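The truncated abstract only states the setting: M players cooperatively pull arms to maximize their cumulative reward. As a rough illustration of that setting, the sketch below simulates a collision-based MMAB instance with Bernoulli arms; the collision rule (an arm pulled by more than one player yields no reward), the function `simulate_mmab`, and the round-robin example policy are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np

def simulate_mmab(means, n_players, horizon, policy, rng=None):
    """Simulate a collision-based MMAB instance with Bernoulli arms.

    `policy(t, player, history)` returns the arm chosen by `player` at
    round `t`; `history` is a list of (choices, rewards) pairs from
    earlier rounds. Returns the cumulative reward over all players.
    """
    rng = rng or np.random.default_rng(0)
    n_arms = len(means)
    history = []
    total_reward = 0.0
    for t in range(horizon):
        choices = [policy(t, m, history) for m in range(n_players)]
        counts = np.bincount(choices, minlength=n_arms)
        rewards = []
        for arm in choices:
            # Assumed collision model: an arm pulled by several players pays nothing.
            if counts[arm] > 1:
                rewards.append(0.0)
            else:
                rewards.append(float(rng.random() < means[arm]))
        history.append((choices, rewards))
        total_reward += sum(rewards)
    return total_reward

# Example: 2 players, 4 arms, a naive round-robin policy that keeps players orthogonal.
if __name__ == "__main__":
    means = [0.2, 0.5, 0.7, 0.9]
    policy = lambda t, m, h: (t + m) % len(means)
    print(simulate_mmab(means, n_players=2, horizon=1000, policy=policy))
```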