With Great Power Comes Great Adaptation: Message Tuning Outshines Prompt Tuning for Graph Foundation Models

ICLR 2026 Conference Submission17816 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Foundation Models, Message Tuning, Prompt Tuning, Prismatic Space Theory, Adaptation Capacity.
TL;DR: We propose Message Tuning for Graph Foundation Models (MTG), a lightweight adaptation method that outshines prompt tuning, supported by a novel theoretical framework (Prismatic Space Theory) demonstrating its superior adaptation capacity.
Abstract: Graph foundation models (GFMs), built upon the “Pre-training and Adaptation” paradigm, have emerged as a promising path toward artificial general intelligence on graphs. Despite the remarkable potential of large language models, most existing GFMs still adopt Graph Neural Networks (GNNs) as their backbone. For such GNN-based GFMs, prompt tuning has become the prevailing adaptation method for downstream tasks. However, while recent theoretical research has revealed why graph prompt tuning works, how to measure its adaptation capacity remains an open problem. In this paper, we propose Prismatic Space Theory (PS-Theory) to quantify the capacity of adaptation approaches and establish the upper bound for the adaptation capacity of prompt tuning. Inspired by prefix-tuning, we introduce Message Tuning for GFMs (MTG), a lightweight approach that injects a small set of learnable message prototypes into each layer of the GNN backbone to adaptively guide message fusion without updating the frozen pre-trained weights. Through our PS-Theory, we rigorously prove that MTG has greater adaptation capacity than prompt tuning. Extensive experiments demonstrate that MTG consistently outperforms prompt tuning baselines across diverse benchmarks, validating our theoretical findings. Our code is available at https://anonymous.4open.science/r/MTG.
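The abstract describes MTG as injecting learnable message prototypes into each GNN layer while the pre-trained weights stay frozen. The sketch below illustrates that general idea in plain PyTorch; it is not the authors' implementation (see the linked repository for that). The class name `MessageTunedLayer`, the number of prototypes, the mean aggregation, and the attention-style fusion of messages with prototypes are all illustrative assumptions.

```python
# Minimal sketch of message tuning: learnable per-layer prototypes are fused
# into the messages of a frozen GNN layer. Assumed design, not the paper's code.
import torch
import torch.nn as nn


class MessageTunedLayer(nn.Module):
    """One GNN layer whose pre-trained weights stay frozen; only the
    injected message prototypes are trainable."""

    def __init__(self, frozen_linear: nn.Linear, num_prototypes: int = 4):
        super().__init__()
        self.frozen = frozen_linear
        for p in self.frozen.parameters():
            p.requires_grad = False  # keep the pre-trained backbone frozen
        d = frozen_linear.out_features
        # Learnable message prototypes injected into this layer (assumed shape [K, d]).
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, d) * 0.02)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Frozen message computation: mean-aggregate neighbors, then linear map.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        msgs = self.frozen(adj @ x / deg)
        # Assumed fusion rule: attention-style scores over prototypes guide fusion.
        scores = torch.softmax(msgs @ self.prototypes.T, dim=-1)  # [N, K]
        tuned = msgs + scores @ self.prototypes                   # guided message fusion
        return torch.relu(tuned)


# Usage: wrap a pre-trained layer; only the prototypes receive gradients.
layer = MessageTunedLayer(nn.Linear(16, 16))
x, adj = torch.randn(8, 16), (torch.rand(8, 8) > 0.5).float()
out = layer(x, adj)
trainable = [p for p in layer.parameters() if p.requires_grad]  # just the prototypes
```

Under these assumptions, the number of trainable parameters per layer is only `num_prototypes × d`, which is what makes the adaptation lightweight relative to fine-tuning the backbone.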
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 17816