GraphPFN: A Prior-Data Fitted Graph Foundation Model

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: graph foundation models, tabular foundation models, LimiX, graph neural network, graph machine learning
Abstract: Graph foundation models face several fundamental challenges, including limited transferability across datasets and data scarcity, which call into question whether such models are feasible at all. However, despite facing similar challenges, the tabular domain has recently witnessed the emergence of the first successful foundation models, such as TabPFNv2 and LimiX. Many of these models are based on the prior-data fitted networks (PFN) framework, in which models are pretrained on carefully designed synthetic datasets and make predictions in an in-context learning regime. Recently, G2T-FM took the first step towards adopting PFNs for graph tasks, yet it is limited to hand-crafted features and was never pretrained on graph data. In this work, we take the next step by proposing GraphPFN, a PFN-based model designed and pretrained specifically for graphs. Following the PFN framework, we first design a prior distribution over synthetic attributed graphs, using a novel combination of multiple stochastic block models and a preferential attachment process for structure generation, and graph-aware structural causal models for attribute generation. Then, we augment the tabular foundation model LimiX with attention-based graph neighborhood aggregation layers and train it on synthetic graphs sampled from our prior. On diverse real-world graph datasets with up to $50{,}000$ nodes, GraphPFN shows strong in-context learning performance and achieves state-of-the-art results after finetuning, outperforming both G2T-FM and task-specific GNNs trained from scratch on most datasets. More broadly, we hope that GraphPFN demonstrates the potential of PFN-based models for building graph foundation models.
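To make the prior-generation idea concrete, below is a minimal, hypothetical sketch of one ingredient of such a prior: sampling graph structure from a single stochastic block model and generating node attributes with a toy graph-aware structural causal model that mixes node-local noise with aggregated neighbor values. This is an illustration only, not the paper's actual generator (which additionally combines multiple SBMs with a preferential attachment process); all function names and parameter choices here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sbm(n, k, p_in, p_out, rng):
    """Sample an undirected stochastic block model graph.
    Returns (adjacency matrix, block labels)."""
    blocks = rng.integers(0, k, size=n)
    same = blocks[:, None] == blocks[None, :]
    probs = np.where(same, p_in, p_out)          # higher edge prob within blocks
    upper = np.triu(rng.random((n, n)) < probs, k=1)
    adj = upper | upper.T                        # symmetrize, no self-loops
    return adj.astype(float), blocks

def graph_scm_features(adj, d, rng):
    """Toy graph-aware SCM for attributes: each causal 'layer' applies a
    random nonlinear map to node features mixed with mean-aggregated
    neighbor features (illustrative, not the paper's SCM)."""
    deg = adj.sum(1, keepdims=True).clip(min=1)
    x = rng.normal(size=(adj.shape[0], d))       # exogenous noise
    for _ in range(2):
        neigh = adj @ x / deg                    # mean neighborhood aggregation
        x = np.tanh(x @ rng.normal(size=(d, d)) * 0.5 + neigh)
    return x

adj, blocks = sample_sbm(n=200, k=4, p_in=0.15, p_out=0.01, rng=rng)
feats = graph_scm_features(adj, d=8, rng=rng)
labels = blocks  # e.g., a node classification target for pretraining
print(adj.shape, feats.shape)  # (200, 200) (200, 8)
```

A PFN-style pretraining loop would repeatedly sample such synthetic attributed graphs (with randomized sizes, block structures, and SCM depths) and train the model to predict held-out node labels in context.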
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 12060