Keywords: computational rationality, user modeling, probabilistic modeling, human-AI interaction
TL;DR: We introduce a method for efficient online inference with advanced, computationally costly cognitive models, enabling the use of modern machine learning pipelines even in the ubiquitous settings where humans are in the loop.
Abstract: Probabilistic user modeling is essential for building machine learning systems in the ubiquitous settings where humans are in the loop. However, advanced user models, often designed as cognitive behavior simulators, are incompatible with modern machine learning pipelines and computationally prohibitive for most practical applications. We address this problem by introducing widely applicable differentiable surrogates that bypass this computational bottleneck; the surrogates enable computationally efficient inference with modern cognitive models. We show experimentally that modeling capabilities comparable to those of the only available alternative, existing likelihood-free inference methods, are achievable at a computational cost suitable for online applications. Finally, we demonstrate how AI assistants can now use cognitive models for online interaction in a menu-search task, which has so far required hours of computation during interaction.
Supplementary Material: pdf
Other Supplementary Material: zip