Keywords: graph neural network, explainer, molecular graph, ante-hoc
Abstract: Explaining the decisions made by machine learning models for high-stakes applications is critical for transparency. This is particularly true for models of graphs, where decisions depend on complex patterns combining structural and attribute data. We propose EAGER (Effective Ante-hoc Graph Explainer), a novel and flexible ante-hoc explainer designed to discover explanations for graph neural networks, with a focus on the chemical domain. As an ante-hoc model, EAGER inductively learns a graph predictive model and its associated explainer together. We employ a novel bilevel iterative training process based on optimizing the Information Bottleneck principle, effectively distilling the most useful substructures while discarding irrelevant details. As a result, EAGER can identify molecular substructures that carry precisely the information needed for prediction. Our experiments on various molecular classification tasks show that EAGER produces better explanations than existing post-hoc and ante-hoc approaches.
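The abstract only names the ingredients of the method; the sketch below is a minimal, hypothetical illustration of what a bilevel iterative training loop with an Information Bottleneck surrogate could look like, not the paper's actual implementation. The module names (`Explainer`, `Predictor`), the mask-size penalty used as the compression term, and the alternating update schedule are all assumptions made for illustration.

```python
# Hypothetical sketch: alternating (bilevel) training of a predictor and an
# ante-hoc explainer under an Information Bottleneck surrogate objective.
# None of these names or design choices come from the paper itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Explainer(nn.Module):
    """Scores each node; a sigmoid turns scores into a soft subgraph mask."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):
        return torch.sigmoid(self.score(x)).squeeze(-1)  # shape: (num_nodes,)

class Predictor(nn.Module):
    """One-layer message-passing GNN, mean pooling, then a graph classifier."""
    def __init__(self, dim, num_classes):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.cls = nn.Linear(dim, num_classes)

    def forward(self, x, adj, mask):
        x = x * mask.unsqueeze(-1)      # keep only the explained substructure
        h = F.relu(adj @ self.lin(x))   # simple neighborhood aggregation
        return self.cls(h.mean(dim=0))  # graph-level logits, shape: (num_classes,)

def ib_loss(logits, label, mask, beta=0.1):
    # IB surrogate (assumed): fit the label while "compressing" the input,
    # approximated here by penalizing the expected size of the soft mask.
    return F.cross_entropy(logits.unsqueeze(0), label) + beta * mask.mean()

def train(graphs, dim=16, num_classes=2, epochs=20):
    """graphs: iterable of (node_features, adjacency, label) tensors."""
    expl, pred = Explainer(dim), Predictor(dim, num_classes)
    opt_e = torch.optim.Adam(expl.parameters(), lr=1e-3)
    opt_p = torch.optim.Adam(pred.parameters(), lr=1e-3)
    for _ in range(epochs):
        for x, adj, label in graphs:
            # Lower level: fit the predictor on the currently masked graph.
            with torch.no_grad():
                mask = expl(x)
            loss_p = ib_loss(pred(x, adj, mask), label, mask)
            opt_p.zero_grad(); loss_p.backward(); opt_p.step()
            # Upper level: update the explainer against the frozen predictor.
            mask = expl(x)
            loss_e = ib_loss(pred(x, adj, mask), label, mask)
            opt_e.zero_grad(); loss_e.backward(); opt_e.step()
    return expl, pred
```

In this reading, the explainer's soft mask plays the role of the bottleneck variable: the cross-entropy term keeps the masked subgraph predictive of the label, while the `beta`-weighted penalty pushes the mask toward small, precise substructures.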
Primary Area: interpretability and explainable AI
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12974