GRAPH GENERATIVE PRE-TRAINED TRANSFORMER

Published: 06 Mar 2025, Last Modified: 05 Apr 2025 · ICLR 2025 DeLTa Workshop Poster · CC BY 4.0
Track: long paper (up to 8 pages)
Keywords: Graph generation, Foundation Models, GPT
TL;DR: A GPT model for graph generation
Abstract: Graph generation is an essential task across various domains, such as molecular design and social network analysis, as it enables the modeling of complex relationships and structured data. While many modern graph generative models rely on adjacency matrices, this work revisits an approach that represents graphs as sequences of node and edge sets. We argue that this representation offers more efficient graph encoding and devise a method for encoding graphs as token sequences. Leveraging this representation, we present the Graph Generative Pre-trained Transformer (G2PT), an auto-regressive model designed to learn graph structures through next-token prediction. To extend G2PT's utility as a general-purpose foundation model, we explore fine-tuning techniques for two downstream tasks: goal-oriented generation and graph property prediction. Comprehensive experiments across multiple datasets demonstrate that G2PT delivers state-of-the-art generative performance on both generic graph and molecular datasets. Moreover, G2PT shows strong adaptability and versatility in downstream applications, ranging from molecular design to property prediction.
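The abstract's core idea is serializing a graph into a flat token sequence (node set first, then edge set) so a standard auto-regressive transformer can be trained with next-token prediction. The sketch below is a minimal, hypothetical illustration of that kind of tokenization; the actual vocabulary, special tokens, and ordering used by G2PT are not specified here and may differ.

```python
# Hypothetical sketch: serialize a graph as <bos> nodes ... <sep> edges ... <eos>.
# This is an illustration of the general idea, not the paper's exact tokenizer.

def graph_to_tokens(num_nodes, edges, node_types=None):
    """Return a flat token sequence: node tokens followed by edge tokens."""
    tokens = ["<bos>"]
    for v in range(num_nodes):
        label = node_types[v] if node_types else "node"
        tokens += [f"<n{v}>", label]      # node index plus (optional) node type
    tokens.append("<sep>")
    for u, v in edges:                    # one (source, target) pair per edge
        tokens += [f"<n{u}>", f"<n{v}>"]
    tokens.append("<eos>")
    return tokens

# Example: a triangle graph with three carbon atoms (molecular-style labels).
seq = graph_to_tokens(3, [(0, 1), (1, 2), (0, 2)], node_types=["C", "C", "C"])
print(seq)
```

Given such sequences, training reduces to ordinary next-token prediction: the model predicts token t+1 from the first t tokens, exactly as in language GPTs.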
Submission Number: 72