- TL;DR: We introduce an efficient memory layer that can learn representation and coarsen input graphs simultaneously without relying on message passing.
- Abstract: Graph Neural Networks (GNNs) are a class of deep models that operate on data with arbitrary topology and order-invariant structure, represented as graphs. We introduce an efficient memory layer for GNNs that learns to jointly perform graph representation learning and graph pooling. We also introduce two new networks based on our memory layer, the Memory-Based Graph Neural Network (MemGNN) and the Graph Memory Network (GMN), which learn hierarchical graph representations by coarsening the graph throughout the layers of memory. Experimental results demonstrate that the proposed models achieve state-of-the-art results on six out of seven graph classification and regression benchmarks. We also show that the learned representations can correspond to chemical features in molecular data.
- Keywords: Graph Neural Networks, Memory Networks, Hierarchical Graph Representation Learning