MABWISER: Parallelizable Contextual Multi-armed Bandits

2021 (modified: 21 Apr 2022) · Int. J. Artif. Intell. Tools 2021
Abstract: Contextual multi-armed bandit algorithms are an effective approach for online sequential decision-making problems. However, there are limited tools available to support their adoption in the commun...
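To ground the idea of a multi-armed bandit for online sequential decision-making, here is a minimal epsilon-greedy sketch in plain Python. This is an illustrative toy, not MABWISER's actual API; the class name, arm labels, and `fit`/`predict` method names are assumptions chosen for clarity.

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy multi-armed bandit (illustrative sketch,
    not the MABWISER API): explore with probability epsilon, otherwise
    pick the arm with the highest observed mean reward."""

    def __init__(self, arms, epsilon=0.1, seed=42):
        self.arms = list(arms)
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = {a: 0 for a in self.arms}   # pulls per arm
        self.means = {a: 0.0 for a in self.arms}  # running mean reward

    def fit(self, decisions, rewards):
        # Update the running mean reward for each arm from history.
        for arm, reward in zip(decisions, rewards):
            self.counts[arm] += 1
            self.means[arm] += (reward - self.means[arm]) / self.counts[arm]

    def predict(self):
        # Explore uniformly at random with probability epsilon...
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.arms)
        # ...otherwise exploit the best arm observed so far.
        return max(self.arms, key=lambda a: self.means[a])

# Hypothetical usage: two arms, a short observed history, then a pick.
bandit = EpsilonGreedyBandit(arms=["arm_1", "arm_2"], epsilon=0.1)
bandit.fit(decisions=["arm_1", "arm_1", "arm_2"], rewards=[0.0, 1.0, 0.0])
choice = bandit.predict()
```

A contextual bandit, as in the paper's title, extends this by conditioning the arm choice on a feature vector (the context) rather than on global arm means alone.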