Logical Distillation of Graph Neural Networks

Published: 24 Jun 2024 · Last Modified: 31 Jul 2024 · ICML 2024 MI Workshop Poster · License: CC BY 4.0
Keywords: Graph Neural Networks, C2, First-Order Logic, Model Distillation
Abstract: We distill a symbolic model from a Graph Neural Network (GNN). Recent results have shown connections between the expressivity of GNNs and C2, the two-variable fragment of first-order logic with counting quantifiers. We use decision trees to represent formulas in an extension of C2 and present an algorithm that distills such decision trees from a given GNN model. We evaluate our approach on multiple GNN architectures. The distilled models are interpretable, succinct, and attain accuracy similar to that of the underlying GNN. Furthermore, when the ground truth is expressible in C2, our approach outperforms the GNN.
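To make the distillation idea concrete, here is a minimal sketch, not the paper's actual algorithm: a decision tree is fit to mimic a GNN's node-level predictions using C2-style counting features (a node's own attributes plus counts of neighbors satisfying each atomic attribute). The graph data and the `gnn_predict` teacher are hypothetical stand-ins.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

def c2_counting_features(adj, node_feats):
    """Per-node features in the spirit of C2: the node's own attributes,
    counts of neighbors having each atomic attribute, and the degree."""
    neighbor_counts = adj @ node_feats          # neighbors satisfying each attribute
    degree = adj.sum(axis=1, keepdims=True)     # plain neighbor count
    return np.hstack([node_feats, neighbor_counts, degree])

# --- hypothetical graph and teacher model ---
rng = np.random.default_rng(0)
n = 200
adj = (rng.random((n, n)) < 0.05).astype(float)
adj = np.maximum(adj, adj.T)                    # make the graph undirected
node_feats = (rng.random((n, 3)) < 0.5).astype(float)

def gnn_predict(adj, node_feats):
    """Stand-in for a trained GNN's hard node predictions."""
    return (adj @ node_feats[:, 0] > 3).astype(int)

X = c2_counting_features(adj, node_feats)
y = gnn_predict(adj, node_feats)                # teacher labels from the GNN

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree))                        # human-readable surrogate model
print("fidelity to GNN:", tree.score(X, y))     # agreement with the teacher
```

The printed tree reads as a nested formula over counting conditions (e.g., "more than 3 neighbors with attribute 0"), which is what makes this style of surrogate interpretable; the paper's method additionally handles an extension of C2 and iterated message-passing layers.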
Submission Number: 42