MABWISER: Parallelizable Contextual Multi-armed Bandits
Emily Strong, Bernard Kleynhans, Serdar Kadioglu
Int. J. Artif. Intell. Tools, 2021 (modified: 21 Apr 2022)
Abstract:
Contextual multi-armed bandit algorithms are an effective approach for online sequential decision-making problems. However, there are limited tools available to support their adoption in the commun...
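As a minimal illustration of the kind of tooling the abstract refers to, the sketch below uses MABWISER's open-source Python interface (MAB, LearningPolicy.LinUCB, fit, predict, partial_fit). The arm names, rewards, and contexts are made up for illustration and are not taken from the paper.

```python
# A minimal sketch of a contextual bandit workflow with MABWISER.
# Arm names, rewards, and contexts below are illustrative only.
from mabwiser.mab import MAB, LearningPolicy

# Historical interaction data: which arm was chosen, the observed reward,
# and the context features at decision time.
decisions = ["ad_1", "ad_1", "ad_2", "ad_2"]
rewards   = [1, 0, 0, 1]
contexts  = [[0.2, 1.0], [0.8, 0.3], [0.9, 0.1], [0.1, 0.7]]

# LinUCB is a contextual learning policy; n_jobs=-1 requests the library's
# built-in parallelization across available cores.
mab = MAB(arms=["ad_1", "ad_2"],
          learning_policy=LearningPolicy.LinUCB(alpha=1.25),
          n_jobs=-1)

# Train on the logged data.
mab.fit(decisions, rewards, contexts)

# Recommend an arm for a new context, then continue learning online.
print(mab.predict([[0.5, 0.5]]))
mab.partial_fit(["ad_1"], [1], [[0.5, 0.5]])
```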