ALT 2021
Abstract: Bandits with covariates, a.k.a. \emph{contextual bandits}, address situations where the optimal actions (or arms) at a given time $t$ depend on a \emph{context} $x_t$, e.g., a new patient’s medical history…
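To make the protocol concrete, below is a minimal sketch of a contextual bandit interaction loop, assuming linear mean rewards per arm and an epsilon-greedy policy with per-arm ridge-regression estimates. This is an illustrative toy setup, not the algorithm or model studied in the paper; all names and parameters here are hypothetical.

```python
# Minimal contextual bandit sketch (illustrative, not the paper's method):
# at each round t, a context x_t is revealed, the learner picks an arm,
# and observes a noisy reward depending on both the arm and x_t.
import numpy as np

rng = np.random.default_rng(0)
n_arms, dim, horizon, eps = 3, 5, 1000, 0.1

# Per-arm ridge-regression statistics: A_a = I + sum x x^T, b_a = sum r x.
A = [np.eye(dim) for _ in range(n_arms)]
b = [np.zeros(dim) for _ in range(n_arms)]

# Hidden arm parameters, used only to simulate rewards in this toy example.
theta_true = rng.normal(size=(n_arms, dim))

for t in range(horizon):
    x_t = rng.normal(size=dim)  # context revealed at time t
    # Estimated mean reward of each arm: <theta_hat_a, x_t>.
    estimates = [np.linalg.solve(A[a], b[a]) @ x_t for a in range(n_arms)]
    # Epsilon-greedy: explore uniformly with prob. eps, else exploit.
    if rng.random() < eps:
        a_t = int(rng.integers(n_arms))
    else:
        a_t = int(np.argmax(estimates))
    r_t = theta_true[a_t] @ x_t + rng.normal(scale=0.1)  # noisy reward
    # Update only the chosen arm's statistics.
    A[a_t] += np.outer(x_t, x_t)
    b[a_t] += r_t * x_t
```

The point of the sketch is the dependence of the optimal arm on $x_t$: with different contexts, different arms maximize the estimated reward, which is what distinguishes contextual bandits from the context-free multi-armed bandit problem.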