A Formal Framework to Characterize Interpretability of Procedures
Amit Dhurandhar, Vijay Iyengar, Ronny Luss, Karthikeyan Shanmugam
Jun 11, 2017 (modified: Jun 19, 2017) · ICML 2017 WHI Submission · readers: everyone
Abstract: We provide a novel notion of what it means to be interpretable, looking past the usual association with human understanding. Our key insight is that interpretability is not an absolute concept, so we define it relative to a target model, which may or may not be a human. We define a framework that allows interpretable procedures to be compared by linking interpretability to important practical aspects such as accuracy and robustness. We characterize many of the current state-of-the-art interpretable methods in our framework, demonstrating its general applicability.
TL;DR: We provide formal definitions to measure and characterize interpretability.