Keywords: Feature Importance, Explainable Artificial Intelligence, Multi-class classification
TL;DR: A new feature importance measure for multi-class classification that quantifies a feature's importance in separating a pair of classes
Abstract: Feature importance is one of the most prominent methods in explainable artificial intelligence. It seeks to score the features an artificial intelligence model relies on the most. In multi-class classification, current methods fail to explain inter-class relationships, as they either provide explanations for binary classification only or suffer from aggregation bias. In a multi-class classification scenario, features may carry discriminative power to separate some of the classes while being otherwise less relevant. State-of-the-art feature importance measures do not capture this behavior. We propose Inter-Class Feature Importance (ICFI), a measure that scores a feature's importance in discriminating between an arbitrary pair of classes. ICFI is a post-hoc, model-agnostic method, independent of the machine learning architecture employed. ICFI marginalizes the target output with respect to the feature of interest, leveraging the resulting change in model behavior to quantify feature importance. We present ICFI's properties and argue its relevance, describing use cases and showing the insights gained. We demonstrate through thorough experiments on real-world datasets how ICFI captures feature characteristics for specific class relationships.
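The abstract describes ICFI only at a high level: marginalize the feature of interest and measure how the model's behavior on a given class pair changes. Below is a minimal, hypothetical sketch of one plausible instantiation of that idea, using permutation-style marginalization and the drop in mean class-pair probability margin as the score. The function name `icfi_sketch`, the permutation choice, and the margin criterion are assumptions for illustration, not the paper's exact ICFI definition.

```python
import numpy as np

def icfi_sketch(model, X, feature, class_a, class_b, n_repeats=10, rng=None):
    """Hypothetical inter-class feature importance sketch (not the paper's exact ICFI).

    Marginalizes one feature by permuting its column and measures how much the
    model's separation between class_a and class_b degrades as a result.
    Assumes X is a 2-D NumPy array and model exposes predict_proba().
    """
    rng = np.random.default_rng(rng)
    proba = model.predict_proba(X)  # shape: (n_samples, n_classes)
    # Baseline separation between the two classes of interest.
    base_margin = np.abs(proba[:, class_a] - proba[:, class_b]).mean()

    drops = []
    for _ in range(n_repeats):
        X_perm = X.copy()
        # Break the link between this feature and the target by shuffling its values.
        X_perm[:, feature] = rng.permutation(X_perm[:, feature])
        proba_perm = model.predict_proba(X_perm)
        margin = np.abs(proba_perm[:, class_a] - proba_perm[:, class_b]).mean()
        drops.append(base_margin - margin)  # loss of class-pair separation

    return float(np.mean(drops))
```

As a usage example under the same assumptions, for a fitted scikit-learn classifier `clf` on a three-class dataset, `icfi_sketch(clf, X, feature=2, class_a=1, class_b=2)` would estimate how much feature 2 contributes to separating classes 1 and 2 specifically, which is the kind of class-pair-specific score the abstract attributes to ICFI.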
Primary Area: interpretability and explainable AI
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6159