Struct-G: Structural-Aware Pretraining for Graph and Task Transfer Learning

ICLR 2026 Conference Submission14714 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph, Pre-training, Multi-task, Transfer learning
TL;DR: A structure-aware graph pre-training framework that achieves strong results across graph tasks.
Abstract: Transfer learning has revolutionized domains like vision and language by enabling pretrained models to adapt rapidly with minimal supervision. However, applying transfer learning to graph-structured data faces unique challenges: graphs exhibit diverse topology, sparse or heterogeneous node attributes, and lack consistent semantics across datasets, making it difficult to learn representations that generalize across domains. Recent graph pretraining efforts, including generative methods and contrastive objectives, have shown promise but often rely on complex architectures, rich feature modalities, or heavy computation, limiting their applicability to structure-only graphs and resource-constrained settings. To address these challenges, we introduce Struct-G, a lightweight pretraining framework that decouples global topology capture from local feature refinement. Struct-G first computes shallow random-walk-based structural embeddings, then fuses them with raw attributes via an adaptive, feature-wise gating network and a shared message-passing backbone. By jointly optimizing multiple self-supervised objectives, such as link prediction, node classification, feature reconstruction, and structural alignment, Struct-G learns robust node embeddings that transfer effectively with minimal fine-tuning. Extensive experiments demonstrate that explicit structural inductive bias and self-supervised multi-task learning provide a scalable and accessible foundation for graph representation learning.
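The adaptive, feature-wise gating described in the abstract can be sketched as a learned convex combination of the two input views. The code below is a minimal illustration, not the paper's implementation: the gate parameterization (a single linear layer over the concatenated structural embedding and attribute vector, followed by a sigmoid) and all shapes and names (`gated_fusion`, `W`, `b`) are assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(struct_emb, attrs, W, b):
    """Feature-wise gate over concatenated [structural ; attribute] inputs.

    struct_emb : (n, d) shallow random-walk structural embeddings
    attrs      : (n, d) raw node attributes, assumed projected to width d
    W, b       : hypothetical gate parameters; W has shape (2d, d)
    """
    z = np.concatenate([struct_emb, attrs], axis=1)   # (n, 2d)
    gate = sigmoid(z @ W + b)                         # (n, d), each entry in (0, 1)
    # Per-feature convex combination of the structural and attribute views.
    return gate * struct_emb + (1.0 - gate) * attrs

# Toy example: 5 nodes, 4-dimensional embeddings.
n, d = 5, 4
struct_emb = rng.normal(size=(n, d))
attrs = rng.normal(size=(n, d))
W = rng.normal(scale=0.1, size=(2 * d, d))
b = np.zeros(d)

fused = gated_fusion(struct_emb, attrs, W, b)
print(fused.shape)  # (5, 4)
```

Because the gate lies in (0, 1) per feature, each fused coordinate stays between the corresponding structural and attribute values, so the network can smoothly emphasize topology where attributes are sparse and vice versa.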
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 14714