An Effective and Efficient Generation Framework for Condensing the Graph Repository

ICLR 2026 Conference Submission 15783 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Condensation for the graph repository, Graph Algorithm, Neural Network Model
Abstract: Graph repositories containing multiple graphs are increasingly prevalent across applications. As data volumes grow, training neural networks on such repositories becomes increasingly burdensome. However, existing condensation methods focus on reducing the size of a single graph; they fail to address the challenge of compressing multiple data graphs efficiently and effectively. In this work, we propose a novel end-to-end graph repository condensation framework (GRCOND) that condenses a large-scale graph repository with multiple graphs while preserving task-relevant structural and feature information. Unlike traditional methods, our approach pretrains a dataset-specific GNN model to create and optimize synthetic graphs, capturing both intra-graph structures and inter-graph relationships and thus enabling a more holistic representation of the repository. Experiments show that our approach consistently achieves higher accuracy and better feature retention across compression ratios, highlighting its potential to accelerate GNN training and extend graph-based machine learning to resource-constrained environments.
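The abstract does not spell out GRCOND's objective, but condensation frameworks of this kind are commonly built on gradient matching: synthetic data is optimized so that the gradients it induces in a (pre)trained probe model match those of the real data. The sketch below illustrates that generic idea on a toy linear surrogate with NumPy; all sizes, the label assignment, and the use of finite differences are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "repository" stand-in: real node features X with regression targets y.
n, d, m = 100, 3, 4                      # real nodes, feature dim, synthetic nodes (hypothetical sizes)
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

def grad_w(Xm, ym, w):
    """Gradient of the MSE loss of a linear model wrt its weights w."""
    return (2.0 / len(ym)) * Xm.T @ (Xm @ w - ym)

def match_loss(Xs, ys, w):
    """Squared distance between real-data and synthetic-data gradients."""
    diff = grad_w(X, y, w) - grad_w(Xs, ys, w)
    return float(np.sum(diff ** 2))

# Synthetic features start random; labels are fixed (an illustrative choice).
Xs = rng.normal(size=(m, d))
ys = X[:m] @ w_true

w = rng.normal(size=d)                   # probe weights, standing in for a pretrained model
init_loss = match_loss(Xs, ys, w)

eps, lr = 1e-5, 1e-3
for _ in range(500):
    # Finite-difference gradient of the matching loss wrt each synthetic entry
    # (a real implementation would use autodiff, e.g. PyTorch).
    g = np.zeros_like(Xs)
    for i in range(m):
        for j in range(d):
            Xp = Xs.copy(); Xp[i, j] += eps
            Xn = Xs.copy(); Xn[i, j] -= eps
            g[i, j] = (match_loss(Xp, ys, w) - match_loss(Xn, ys, w)) / (2 * eps)
    Xs -= lr * g                         # descend on the matching objective

final_loss = match_loss(Xs, ys, w)
print(init_loss, "->", final_loss)       # matching loss should shrink
```

A full framework would repeat this over many probe-model states and over every graph in the repository, and would optimize synthetic adjacency structure as well as features; this sketch only shows the core matching step on features.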
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 15783