Learning from Multi-Table Relational Data with the Relational Graph Perceiver

Published: 18 Nov 2025, Last Modified: 18 Nov 2025 · AITD@EurIPS 2025 Poster · CC BY 4.0
Submission Type: Short paper (4 pages)
Keywords: relational deep learning, relational graph transformers, graph transformers, relational databases
Abstract: Relational data in domains such as healthcare, finance, and e-commerce capture complex, time-evolving interactions among diverse entities. Models operating on such data must integrate long-range spatial and temporal dependencies while supporting multiple predictive tasks. However, existing relational graph models largely focus on spatial structure, treating time as a constraint rather than a modeling signal, and are typically designed for single-task prediction. We introduce the Relational Graph Perceiver (RGP), a transformer architecture with a Perceiver-style latent bottleneck that integrates signals from diverse node and edge types into a shared latent space for global relational reasoning. RGP also includes a flexible cross-attention decoder for joint multi-task learning across disjoint label spaces and a temporal subgraph sampler that enhances context by retrieving time-relevant nodes beyond local neighborhoods. Experiments on RelBench, SALT, and CTU show that RGP delivers state-of-the-art performance, offering a general and scalable solution for relational deep learning.
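To make the abstract's central idea concrete, the sketch below illustrates a generic Perceiver-style latent bottleneck: a small set of learned latent vectors cross-attends to a large, variable-length set of input tokens (e.g., embeddings of heterogeneous nodes and edges) and then refines itself with self-attention. This is a minimal illustration under assumed hyperparameters (128-dim embeddings, 64 latents, 4 heads), not the paper's actual RGP implementation; class and variable names are hypothetical.

```python
import torch
import torch.nn as nn

class LatentBottleneck(nn.Module):
    """Illustrative Perceiver-style bottleneck (not the authors' code).

    A fixed number of learned latent vectors cross-attend to an arbitrary
    number of input tokens, so compute scales with num_latents * num_tokens
    rather than num_tokens**2, and the latents form a shared space for
    global reasoning over heterogeneous inputs.
    """

    def __init__(self, dim: int = 128, num_latents: int = 64, num_heads: int = 4):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim) * 0.02)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, tokens: torch.Tensor, key_padding_mask=None) -> torch.Tensor:
        # tokens: (batch, num_tokens, dim) -- e.g., embeddings of sampled
        # nodes/edges of different types projected to a common dimension.
        batch = tokens.size(0)
        z = self.latents.unsqueeze(0).expand(batch, -1, -1)
        # Latents (queries) attend to the token set (keys/values).
        attended, _ = self.cross_attn(z, tokens, tokens,
                                      key_padding_mask=key_padding_mask)
        z = self.norm1(z + attended)
        # Self-attention among the latents for global relational reasoning.
        refined, _ = self.self_attn(z, z, z)
        return self.norm2(z + refined)

# Usage: compress 500 heterogeneous node/edge embeddings into 64 latents.
tokens = torch.randn(2, 500, 128)
latents = LatentBottleneck()(tokens)  # -> shape (2, 64, 128)
```

In the same spirit, the multi-task decoder described in the abstract could be realized by having per-task query vectors cross-attend to these latents, which keeps disjoint label spaces separated while sharing the latent representation; the exact decoder design is specified in the paper itself.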
Published Paper Link: https://openreview.net/pdf?id=fcVIJ2WSIX
Relevance Comments: This work aligns closely with the workshop’s focus on AI for tabular and relational data. RGP advances representation learning for relational databases by formulating them as heterogeneous temporal graphs and enabling scalable, multi-task learning across diverse label spaces. By integrating ideas from graph learning, structured data modeling, and foundation architectures, it contributes to the broader goal of developing general-purpose models for tabular and relational modalities.
Published Venue And Year: LOG 2025
Submission Number: 49