RelGNN: Composite Message Passing for Relational Deep Learning

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: RelGNN is a novel GNN framework that introduces composite message passing and graph attention mechanisms based on automatically derived "atomic routes" to efficiently model relational data, achieving state-of-the-art performance on RelBench.
Abstract: Predictive tasks on relational databases are critical in real-world applications spanning e-commerce, healthcare, and social media. To address these tasks effectively, Relational Deep Learning (RDL) encodes relational data as graphs, enabling Graph Neural Networks (GNNs) to exploit relational structures for improved predictions. However, existing RDL methods often overlook the intrinsic structural properties of the graphs built from relational databases, leading to modeling inefficiencies, particularly in handling many-to-many relationships. Here we introduce RelGNN, a novel GNN framework specifically designed to leverage the unique structural characteristics of the graphs built from relational databases. At the core of our approach is the introduction of atomic routes, which are simple paths that enable direct single-hop interactions between the source and destination nodes. Building upon these atomic routes, RelGNN designs new composite message passing and graph attention mechanisms that reduce redundancy, highlight key signals, and enhance predictive accuracy. RelGNN is evaluated on 30 diverse real-world tasks from RelBench (Fey et al., 2024), and achieves state-of-the-art performance on the vast majority of tasks, with improvements of up to 25%.
Lay Summary: Predictive tasks in areas like e-commerce, healthcare, and social media rely on effectively understanding and modeling relational databases. Relational Deep Learning (RDL) addresses these tasks by encoding relational data as graphs, allowing Graph Neural Networks (GNNs) to exploit the relational structures for improved predictions. However, existing RDL methods often overlook the intrinsic structural properties of these graphs, leading to modeling inefficiencies. Here we introduce RelGNN, a novel GNN framework specifically designed to leverage these unique structural characteristics of graphs built from relational databases. RelGNN introduces the concept of "atomic routes," which are simple paths that enable single-hop interactions between the source and destination nodes. Building upon these atomic routes, RelGNN designs new composite message passing and graph attention mechanisms that enhance predictive accuracy. RelGNN achieves state-of-the-art performance on the vast majority of 30 diverse real-world tasks from RelBench (Fey et al., 2024), with improvements of up to 25%.
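To make the idea of atomic routes concrete, below is a minimal, simplified sketch of how such routes could be enumerated from a relational schema's foreign-key structure. It assumes the schema is given as a plain dictionary of foreign-key targets; the function name, schema representation, and toy tables are hypothetical illustrations, not the repository's actual API, and the paper's full definition of atomic routes covers more cases than this sketch.

```python
from itertools import permutations

def derive_atomic_routes(foreign_keys):
    """Sketch: enumerate atomic routes from a relational schema.

    `foreign_keys` maps each table name to the set of tables its
    foreign-key columns reference, e.g. {"review": {"user", "product"}}.
    A table referencing two or more tables acts as a bridge between them:
    each ordered pair of referenced tables yields an atomic route
    (src, bridge, dst), along which composite message passing can move
    information from src to dst in a single hop through the bridge node.
    """
    routes = []
    for bridge, referenced in foreign_keys.items():
        if len(referenced) < 2:
            continue  # tables with fewer than two foreign keys bridge nothing
        for src, dst in permutations(sorted(referenced), 2):
            routes.append((src, bridge, dst))
    return routes

# Toy e-commerce schema: "review" and "order" rows link users to products.
schema = {
    "review": {"user", "product"},
    "order": {"user", "product"},
    "user": set(),
    "product": set(),
}
print(derive_atomic_routes(schema))
# [('product', 'review', 'user'), ('user', 'review', 'product'),
#  ('product', 'order', 'user'), ('user', 'order', 'product')]
```

In this toy schema, a "review" row connects a user to a product, so a route such as ('user', 'review', 'product') lets a user node send a message directly to a product node through the review it is attached to, rather than requiring two separate hops.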
Link To Code: https://github.com/snap-stanford/RelGNN
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Neural Networks, Relational Deep Learning
Submission Number: 431