G-Local Attention Graph Pooling for Graph Classification

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph neural networks, graph pooling, pooling layer, data augmentation
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a new GNN pooling layer that considers both the global and local structural properties of a graph.
Abstract: Graph pooling is an essential operation in Graph Neural Networks that reduces the size of an input graph while preserving its core structural properties. This compression improves the learned representation of the graph, yielding a performance boost on downstream tasks. Existing pooling methods compute a compressed representation by considering either global topological structures (e.g., cliques, stars, clusters) or local node-level information (e.g., the top-$k$ most informative nodes). However, there is no effective graph pooling method that integrates both the global and local properties of a graph. To this end, we propose a two-channel Global-Local Attention Pooling (GLA-Pool) layer that exploits both properties, generating more robust graph representations. GLA-Pool can be integrated into any GNN-based architecture. Further, we propose a smart data augmentation technique to enrich small-scale datasets. Exhaustive experiments on eight publicly available graph classification benchmarks, under standard metrics, show that GLA-Pool significantly outperforms thirteen state-of-the-art models on six datasets while being on par on the remaining two. The code will be available at this link.
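To make the two-channel idea concrete, here is a minimal PyTorch sketch: one channel lets every node attend to the whole graph (global structure), the other scores nodes and keeps the top-$k$ (local information), and the two are fused before pooling. The class name `GLAPool`, the `k_ratio` parameter, the dense-adjacency interface, and the additive fusion are all illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a two-channel global-local attention pooling layer.
# Dense adjacency is used for simplicity; real pooling layers typically
# operate on sparse edge indices.
import torch
import torch.nn as nn


class GLAPool(nn.Module):
    def __init__(self, dim: int, k_ratio: float = 0.5):
        super().__init__()
        self.k_ratio = k_ratio
        # Local channel: a learned per-node importance score.
        self.local_score = nn.Linear(dim, 1)
        # Global channel: self-attention over all nodes of the graph.
        self.global_att = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        # x: (N, dim) node features; adj: (N, N) dense adjacency.
        # Global channel: each node attends to the whole graph, so its
        # representation reflects graph-level structure.
        g, _ = self.global_att(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0))
        g = g.squeeze(0)
        # Local channel: score nodes and keep the top-k most informative ones.
        scores = self.local_score(x).squeeze(-1)
        k = max(1, int(self.k_ratio * x.size(0)))
        idx = scores.topk(k).indices
        # Fuse the two channels, gate by the local scores, and restrict the
        # node features and adjacency to the kept nodes.
        fused = (x + g)[idx] * torch.sigmoid(scores[idx]).unsqueeze(-1)
        return fused, adj[idx][:, idx]


# Usage: pool a random 10-node graph down to 5 nodes.
x, adj = torch.randn(10, 16), (torch.rand(10, 10) > 0.7).float()
pool = GLAPool(dim=16)
x_p, adj_p = pool(x, adj)
print(x_p.shape, adj_p.shape)  # torch.Size([5, 16]) torch.Size([5, 5])
```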
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8341