SNAPE: A Sequential Non-Stationary Decision Process Model for Adaptive Explanation Generation

Published: 01 Jan 2023 · Last Modified: 29 Aug 2024 · ICAART (1) 2023 · License: CC BY-SA 4.0
Abstract: The automatic generation of explanations is an increasingly important problem in Explainable AI (XAI). However, while most work focuses on how complete and correct information can be extracted, or on how it can be presented, the success of an explanation also depends on the person it is targeted at. We present an adaptive explainer model that constructs and employs a partner model to tailor explanations over the course of the interaction. The model incorporates the different linguistic levels of human-like explanations in a hierarchical, sequential decision process within a non-stationary environment. It relies on online planning (using Monte Carlo Tree Search) to solve a continuously adapted MDP for the selection of explanation actions and explanation moves. We present the model along with first results from explanation interactions with different kinds of simulated users.
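To illustrate the planning scheme the abstract describes, here is a minimal, purely illustrative sketch of online MCTS action selection against a partner model that is re-estimated between moves, making the planning MDP non-stationary. All names (`PartnerModel`, `mcts_choose`, the "simple"/"detailed" actions, the toy reward scheme) are hypothetical assumptions for this sketch and are not taken from the paper.

```python
import math
import random


class PartnerModel:
    """Hypothetical partner model: estimated chance a detailed move succeeds."""

    def __init__(self, receptivity=0.5):
        self.receptivity = receptivity

    def update(self, action, success):
        # Illustrative running estimate, nudged toward the observed outcome.
        if action == "detailed":
            target = 1.0 if success else 0.0
            self.receptivity += 0.2 * (target - self.receptivity)


def step(state, action, partner, rng):
    """Toy transition: understanding level in 0..5; detailed moves
    help receptive partners, simple moves succeed at a fixed rate."""
    p_success = partner.receptivity if action == "detailed" else 0.6
    if rng.random() < p_success:
        return min(state + 1, 5), 1.0   # understanding grows
    return max(state - 1, 0), -0.2      # small cost for a failed move


def mcts_choose(state, partner, rng, n_sim=200, depth=4, c=1.4):
    """One round of online planning: UCB1 at the root, random rollouts below.
    Re-running this after each partner-model update realizes the
    'continuously adapted MDP' idea in miniature."""
    actions = ["simple", "detailed"]
    stats = {a: [0, 0.0] for a in actions}  # visits, total return

    for i in range(1, n_sim + 1):
        # UCB1 selection over root actions (unvisited actions first).
        a = max(
            actions,
            key=lambda x: float("inf") if stats[x][0] == 0
            else stats[x][1] / stats[x][0]
            + c * math.sqrt(math.log(i) / stats[x][0]),
        )
        # Random-policy rollout to estimate the return of taking `a` now.
        s, ret, act = state, 0.0, a
        for _ in range(depth):
            s, r = step(s, act, partner, rng)
            ret += r
            act = rng.choice(actions)
        stats[a][0] += 1
        stats[a][1] += ret

    return max(actions, key=lambda x: stats[x][1] / max(stats[x][0], 1))
```

In an interaction loop, the explainer would call `mcts_choose`, execute the chosen move, observe the addressee's feedback, call `partner.update(...)`, and then re-plan, so each planning round operates on a slightly different MDP.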
