How Much Can Transfer? BRIDGE: Bounded Multi-Domain Graph Foundation Model with Generalization Guarantees

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: In this paper, we introduce BRIDGE, a bounded graph foundation model pre-trained on multiple domains with generalization guarantees.
Abstract: Graph Foundation Models hold significant potential for advancing multi-domain graph learning, yet their full capabilities remain largely untapped. Existing works show promising task performance with the “pretrain-then-prompt” paradigm, but this paradigm lacks a theoretical foundation for understanding why it works and how much knowledge can be transferred from source domains to a target domain. In this paper, we introduce BRIDGE, a bounded graph foundation model pre-trained on multiple domains with generalization guarantees. To learn discriminative source knowledge, we align multi-domain graph features with domain-invariant aligners during pre-training. We then propose a lightweight Mixture of Experts (MoE) network that facilitates downstream prompting through self-supervised, selective knowledge assembly and transfer. Further, to determine the maximum amount of transferable knowledge, we derive an optimizable upper bound on the generalization error from a graph spectral perspective under Lipschitz continuity. Extensive experiments demonstrate the superiority of BRIDGE over 15 state-of-the-art baselines on both node and graph classification.
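The abstract states the bound only at a high level. As a rough, hypothetical illustration of the kind of statement involved (not the paper's actual theorem), generalization bounds for a Lipschitz hypothesis class often follow the classic domain-adaptation shape, where the target error is controlled by the source error plus a source-target discrepancy term; in the graph setting, such discrepancies are commonly measured through the spectrum of the graph Laplacian:

```latex
% Illustrative generic form only -- NOT BRIDGE's actual bound.
% \epsilon_S, \epsilon_T: source/target errors; d(\cdot,\cdot): a
% distribution discrepancy; \lambda^*: joint optimal error; L: the
% Lipschitz constant of the hypothesis h.
\[
  \epsilon_T(h) \;\le\; \epsilon_S(h)
    \;+\; d(\mathcal{D}_S, \mathcal{D}_T)
    \;+\; \lambda^{*},
  \qquad
  \|h(x) - h(x')\| \;\le\; L \,\|x - x'\|.
\]
```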
Lay Summary: Graphs are powerful tools for representing complex systems such as social networks, biological pathways, and transportation maps. However, building models that can learn from graphs across different domains remains a major challenge. Most current approaches focus on task performance without offering theoretical insight into how well knowledge from one domain transfers to another. Our work presents BRIDGE, a graph foundation model designed to handle multiple domains while providing formal guarantees on generalization. During pre-training, BRIDGE aligns graph features from diverse domains using domain-invariant representations, helping the model learn shared patterns. To support efficient transfer, we introduce a lightweight Mixture of Experts network that selects and assembles relevant knowledge for new tasks without full retraining. Additionally, we derive a theoretical upper bound on the model’s generalization error based on graph spectral theory, offering a principled way to estimate transferability. BRIDGE outperforms 15 state-of-the-art models on both node and graph classification tasks. Our approach not only improves performance but also provides a deeper understanding of what makes knowledge transfer effective in graph learning.
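To make the Mixture of Experts idea concrete, here is a minimal, hypothetical PyTorch sketch of a gating network that softly assembles the outputs of frozen per-domain experts; the class name, shapes, and parameters are illustrative assumptions, and the actual BRIDGE implementation lives in the linked repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LightweightMoE(nn.Module):
    """Hypothetical sketch: a single-layer gate that softly combines
    the embeddings produced by frozen per-domain experts.
    Not the paper's implementation -- see the linked repository."""

    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)  # one score per expert

    def forward(self, x: torch.Tensor, expert_outputs: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim) query features for the downstream task
        # expert_outputs: (batch, num_experts, dim) frozen expert embeddings
        weights = F.softmax(self.gate(x), dim=-1)                     # (batch, num_experts)
        return (weights.unsqueeze(-1) * expert_outputs).sum(dim=1)   # weighted assembly

# Usage: combine three pre-trained domain experts for a new graph task.
moe = LightweightMoE(dim=64, num_experts=3)
x = torch.randn(8, 64)
experts = torch.randn(8, 3, 64)
fused = moe(x, experts)  # (8, 64) task-ready representation
```

A soft gate of this kind adds only a single linear layer on top of the frozen experts, which is one plausible reading of why the paper calls the MoE "lightweight": new tasks are served by reweighting existing knowledge rather than by full retraining.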
Link To Code: https://github.com/RingBDStack/BRIDGE
Primary Area: Deep Learning->Graph Neural Networks
Keywords: graph foundation models, graph multi-domain pre-training, graph prompt learning, cross-domain generalization
Submission Number: 927