Count-GNN: Graph Neural Networks for Subgraph Isomorphism Counting

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Abstract: The prevalence of graph structures has attracted a surge of research interest in graph data. As many graph-based tasks exploit recurring subgraph patterns, subgraph isomorphism counting becomes an important problem. Classical methods usually boil down to a backtracking framework that must navigate a huge search space with prohibitive computational cost, owing to the NP-completeness of the problem. Some recent studies resort to graph neural networks (GNNs) to learn low-dimensional representations of both the query subgraph and the input graph, in order to predict the number of query subgraph isomorphisms on the input graph. However, typical GNNs employ a node-centric message passing mechanism that receives and aggregates messages on nodes. While effective on node-oriented tasks, this mechanism is inadequate for the complex structure matching required by isomorphism counting. Moreover, given an input graph, the space of query subgraphs is enormous, so expecting a single model to fit the diverse range of query subgraphs is unrealistic. In this paper, we propose a novel GNN called Count-GNN for subgraph isomorphism counting, which deals with the above challenges at two levels. At the edge level, we adopt an edge-centric message passing scheme, where messages on edges are propagated and aggregated based on edge adjacency. By treating edges as first-class citizens, Count-GNN preserves finer-grained structural information, given that an edge is an atomic unit for encoding graph structure. At the graph level, we modulate the graph representation conditioned on the query subgraph, so that the model can adapt to each unique query for better matching with the input graph. To demonstrate the effectiveness and efficiency of Count-GNN, we conduct extensive experiments on a number of benchmark graphs. Results show that Count-GNN achieves superior performance compared to state-of-the-art baselines.
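The two ideas in the abstract can be illustrated with a toy sketch: an edge-centric propagation step, where a directed edge (u, v) aggregates messages from its predecessor edges (w, u), and a query-conditioned modulation of a graph representation. The mean aggregation, the 0.5/0.5 self/message mixing, and the affine (gamma, beta) modulation below are illustrative assumptions; the abstract does not specify Count-GNN's actual update equations.

```python
def edge_message_passing_step(edges, h):
    """One toy edge-centric message passing step.

    edges: list of directed edges (u, v).
    h: dict mapping each edge to a feature vector (list of floats).
    Each edge (u, v) aggregates messages from predecessor edges (w, u),
    i.e. edges whose head coincides with its tail (edge adjacency).
    """
    preds = {e: [] for e in edges}
    for e in edges:
        u, _v = e
        for f in edges:
            if f[1] == u:  # edge f = (w, u) feeds into e = (u, v)
                preds[e].append(f)
    new_h = {}
    for e in edges:
        msgs = [h[f] for f in preds[e]]
        if msgs:
            # mean-aggregate predecessor edge features, dimension-wise
            agg = [sum(vals) / len(msgs) for vals in zip(*msgs)]
        else:
            agg = [0.0] * len(h[e])
        # combine self feature with the aggregated message (simple 50/50 mix)
        new_h[e] = [0.5 * a + 0.5 * b for a, b in zip(h[e], agg)]
    return new_h


def modulate(g, gamma, beta):
    """Toy query-conditioned modulation of a graph representation g.

    gamma and beta would be produced from the query-subgraph embedding;
    here they are passed in directly for illustration.
    """
    return [gm * x + b for gm, x, b in zip(gamma, g, beta)]
```

For example, on the two-edge path 0 -> 1 -> 2, the edge (1, 2) receives a message only from (0, 1), while (0, 1) has no predecessor edges and falls back to a zero message.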