Grammatical Path Network: You want cycles, paths is all you need.

Published: 16 Nov 2024, Last Modified: 26 Nov 2024 · LoG 2024 Poster · CC BY 4.0
Keywords: Graph Neural Network, Graph Learning
TL;DR: This work introduces the Grammatical Path Network, a novel Graph Neural Network that efficiently captures cycles in graph structures using Context-Free Grammar, achieving high performance on cycle-related tasks without explicit cycle precomputation.
Abstract:

In this work, we address the challenge of learning from structured data by proposing the Grammatical Path Network (GPN), a novel Graph Neural Network (GNN) designed to efficiently capture cycles in graph structures. Building on recent advancements in GNN expressiveness and substructure counting, GPN combines methodologies from Graph Substructure Networks (GSN) with a framework that translates Context-Free Grammars (CFGs) into GNNs. The key innovation lies in using a CFG to count cycles of length $l+1$ by precomputing paths of length $l$ at the edge level, preserving the computational complexity of standard Message Passing Neural Networks (MPNNs). Our experiments demonstrate that GPN matches the performance of GSN on datasets requiring cycle information, without explicit cycle precomputation. This approach offers a promising direction for developing efficient and expressive GNNs for structured data analysis.
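To make the path-to-cycle correspondence concrete: a cycle of length $l+1$ passing through an edge $(u, v)$ corresponds to a path of length $l$ between $u$ and $v$. The sketch below (hypothetical illustration, not the authors' code) shows the simplest instance, $l = 2$: the number of triangles containing an edge $(u, v)$ equals $A^2[u, v]$, since every length-2 walk between two distinct adjacent nodes is a simple path. For longer cycles, raw walk counts overcount non-simple paths, which is where the grammar-derived edge features of GPN come in.

```python
# Hypothetical sketch of edge-level cycle counting via paths, for l = 2:
# triangles through edge (u, v) = number of length-2 paths u -> v = A^2[u, v].
import numpy as np

def triangle_counts_per_edge(A: np.ndarray) -> dict:
    """Return {(u, v): #triangles containing edge (u, v)} for an
    undirected simple graph given as a 0/1 adjacency matrix."""
    A2 = A @ A  # A2[u, v] = number of length-2 walks from u to v
    counts = {}
    n = A.shape[0]
    for u in range(n):
        for v in range(u + 1, n):
            if A[u, v]:  # only count on actual edges
                counts[(u, v)] = int(A2[u, v])
    return counts

# A 4-node graph: a 4-cycle 0-1-2-3-0 plus the chord 0-2,
# giving two triangles, (0,1,2) and (0,2,3).
A = np.array([
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
])
print(triangle_counts_per_edge(A))
# The chord (0, 2) lies in both triangles, every other edge in exactly one.
```

These per-edge counts play the role of the precomputed structural features that GSN attaches to edges, except that here they are derived from path counts rather than from an explicit cycle enumeration.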

Supplementary Materials: zip
Submission Type: Extended abstract (max 4 main pages).
Poster: png
Submission Number: 100