GCSGNN: Towards Global Counterfactual-Based Self-Explainable Graph Neural Networks

ICLR 2026 Conference Submission 14801 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Neural Networks, Self-explainability, Graph Counterfactual Explanations
Abstract: Graph Neural Networks (GNNs) exhibit superior performance in various graph-based tasks, ranging from scene graph generation to drug discovery. However, they operate as black-box models, offering no access to the rationale behind a specific prediction. To enhance the transparency of GNNs, graph counterfactual explanation (GCE) identifies the minimal modifications to the input graph that cause the GNN to change its prediction to a different class. Current GCE methods face two major challenges: (1) they adopt a post-hoc explanation paradigm, separately training an explainer model for an already-trained GNN. This sequential optimization process yields suboptimal explanations, since the GNN training process is not exposed to the explainer. (2) Current methods are primarily local-level approaches, meaning they generate explanations for each input sample individually. As a result, they cannot capture the shared prediction rationales that generalize across the entire input data distribution. To address these two challenges, we propose a novel Global Counterfactual-based Self-explainable GNN (GCSGNN) framework. GCSGNN simultaneously acts as a GNN, providing predictions on input samples, and as an explainer, generating explanations for its predictions. Furthermore, GCSGNN is trained to identify common patterns in the GNN embeddings across input samples, enabling it to learn global (i.e., model-level) explanations. Extensive qualitative and quantitative analysis across various datasets demonstrates that GCSGNN consistently outperforms baseline methods. Our code can be found at https://anonymous.4open.science/r/gcsgnn.
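The abstract's definition of a graph counterfactual explanation (the minimal edit to the input graph that flips the model's prediction) can be illustrated with a toy sketch. This is not the paper's GCSGNN: the classifier below is a hypothetical stand-in (it labels a graph 1 if it contains a triangle, 0 otherwise), and the search is a brute-force scan over edge deletions in increasing edit size, so the first flip found is a minimal counterfactual.

```python
# Toy illustration of graph counterfactual explanation (GCE).
# NOTE: `classify` is a hypothetical stand-in model, not GCSGNN.
from itertools import combinations

def has_triangle(edges):
    """Return True if the undirected graph given by `edges` has a triangle."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return any(w in adj[v]
               for u, v in edges
               for w in adj[u] if w != v)

def classify(edges):
    """Stand-in 'GNN': class 1 iff the graph contains a triangle."""
    return 1 if has_triangle(edges) else 0

def counterfactual(edges):
    """Find a minimal set of edge deletions that flips the prediction.

    Scans deletion sets in order of increasing size, so the first
    flipping set found has minimal cardinality (brute force: fine for
    toy graphs, exponential in general).
    """
    original = classify(edges)
    for k in range(1, len(edges) + 1):
        for drop in combinations(range(len(edges)), k):
            kept = [e for i, e in enumerate(edges) if i not in drop]
            if classify(kept) != original:
                return [edges[i] for i in drop]  # minimal edit found
    return None  # prediction cannot be flipped by deletions alone

# A triangle is classified as 1; deleting any single edge flips it to 0.
graph = [(0, 1), (1, 2), (0, 2)]
print(classify(graph))             # prints 1
print(counterfactual(graph))       # a single-edge deletion, e.g. [(0, 1)]
```

Real GCE methods replace the brute-force scan with a learned, differentiable search, and GCSGNN additionally shares this search logic globally across samples rather than solving it per input.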
Primary Area: interpretability and explainable AI
Submission Number: 14801