Improved Feature Importance Computation for Tree Models Based on the Banzhaf Value

Published: 20 May 2022, Last Modified: 05 May 2023. UAI 2022 Poster.
Abstract: The Shapley value -- a fundamental game-theoretic solution concept -- has recently become one of the main tools used to explain predictions of tree ensemble models. Another well-known game-theoretic solution concept is the Banzhaf value. Although the Banzhaf value is closely related to the Shapley value, its properties w.r.t. feature attribution have not been understood equally well. This paper shows that, for tree ensemble models, the Banzhaf value offers some crucial advantages over the Shapley value while providing similar feature attributions. In particular, we first give an optimal $O(TL + n)$ time algorithm for computing the Banzhaf value-based attribution of a tree ensemble model's output. Here, $T$ is the number of trees, $L$ is the maximum number of leaves in a tree, and $n$ is the number of features. In comparison, the state-of-the-art Shapley value-based algorithm runs in $O(TLD^2 + n)$ time, where $D$ denotes the maximum depth of a tree in the ensemble. Next, we experimentally compare the Banzhaf and Shapley values for tree ensemble models. Both methods deliver essentially the same average importance scores on the studied datasets using two different tree ensemble models (the sklearn implementation of decision trees and the xgboost implementation of gradient boosting decision trees). However, our results indicate that, on top of being computable faster, the Banzhaf value is more numerically robust than the Shapley value.
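To make the solution concept concrete: the Banzhaf value of a player $i$ is its marginal contribution $v(S \cup \{i\}) - v(S)$ averaged uniformly over all $2^{n-1}$ coalitions $S$ not containing $i$ (whereas the Shapley value weights coalitions by size). The sketch below computes exact Banzhaf values by brute-force enumeration from this textbook definition; it is purely illustrative and is not the paper's $O(TL + n)$ tree-traversal algorithm, and the `banzhaf_values` helper and the example game are our own naming, not from the paper.

```python
from itertools import combinations


def banzhaf_values(n, v):
    """Exact Banzhaf values for an n-player game with characteristic function v.

    For each player i, average the marginal contribution v(S | {i}) - v(S)
    over all 2^(n-1) coalitions S of the remaining players. Exponential time;
    only feasible for small n.
    """
    players = range(n)
    phi = []
    for i in players:
        others = [j for j in players if j != i]
        total = 0.0
        for r in range(len(others) + 1):
            for subset in combinations(others, r):
                s = set(subset)
                total += v(s | {i}) - v(s)
        phi.append(total / 2 ** (n - 1))
    return phi


# Sanity check on an additive game, where every player's marginal
# contribution is constant, so the Banzhaf value equals its weight.
weights = [1.0, 2.0, 3.0]
print(banzhaf_values(3, lambda s: sum(weights[j] for j in s)))  # [1.0, 2.0, 3.0]
```

In a feature-attribution setting, $v(S)$ would be the model's expected output when only the features in $S$ are known; the paper's contribution is computing these averages for tree ensembles without enumerating coalitions.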