A Precompute-Then-Adapt Approach for Efficient Graph Condensation

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: graph, condensation
Abstract: Graph Neural Networks (GNNs) have shown great success in leveraging complex relationships in data but face significant computational challenges when dealing with large-scale graphs. To tackle this issue, graph condensation methods aim to compress large graphs into smaller, synthetic ones that can be efficiently used for GNN training. Recent approaches, particularly those based on trajectory matching, have achieved state-of-the-art (SOTA) performance in graph condensation tasks. Trajectory-based techniques closely match the training behavior on a condensed graph with that on the original graph, typically by guiding the trajectory of model parameters during training. However, these methods require repetitive re-training of GNNs during the condensation process, making them impractical for large graphs due to their high computational cost, e.g., taking up to 22 days to condense million-node graphs. In this paper, we propose a novel Precompute-then-Adapt graph condensation framework that overcomes this limitation by separating the condensation process into a one-time precomputation stage and a one-time adaptation learning stage. Remarkably, even with only the precomputation stage, which typically takes seconds, our method surpasses or matches SOTA results on 3 out of 7 benchmark datasets. Extensive experiments demonstrate that our approach achieves better or comparable accuracy while being 96× to 2,455× faster in condensation time compared to SOTA methods, significantly enhancing the practicality of GNNs for large-scale graph applications. Our code and data are available at https://anonymous.4open.science/r/GCPA-F6F9/.
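To illustrate the two-stage structure described in the abstract, below is a minimal, hypothetical sketch of a generic precompute-then-adapt condensation loop. It is not the authors' released implementation (see the repository URL above); the function names and the specific precomputation choice (class-mean prototypes) are assumptions made purely for illustration of how a one-time precomputation can replace repeated GNN re-training inside the condensation loop.

```python
# Hypothetical sketch of a precompute-then-adapt condensation pattern (PyTorch).
# Assumptions: node features and labels are dense tensors; the adaptation
# objective is supplied externally as `loss_fn`. This is NOT the paper's method.
import torch

def precompute_stage(features, labels, num_condensed):
    # One-time precomputation: initialize condensed nodes from class-wise
    # feature prototypes (illustrative choice, not the paper's).
    classes = labels.unique()
    per_class = max(num_condensed // len(classes), 1)
    synth_x, synth_y = [], []
    for c in classes:
        idx = (labels == c).nonzero(as_tuple=True)[0]
        proto = features[idx].mean(dim=0, keepdim=True).repeat(per_class, 1)
        synth_x.append(proto)
        synth_y.append(torch.full((per_class,), int(c)))
    return torch.cat(synth_x), torch.cat(synth_y)

def adapt_stage(synth_x, synth_y, loss_fn, steps=100, lr=0.01):
    # One-time adaptation: refine the condensed features against a fixed,
    # precomputed objective, without re-training GNNs inside the loop.
    synth_x = synth_x.clone().requires_grad_(True)
    opt = torch.optim.Adam([synth_x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(synth_x, synth_y)
        loss.backward()
        opt.step()
    return synth_x.detach(), synth_y
```

In this sketch, the expensive work (here, only class-wise aggregation) happens once in `precompute_stage`, and `adapt_stage` performs a single optimization pass over the condensed features, which is what lets a precompute-then-adapt design avoid the repeated GNN training that trajectory-matching methods require.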
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5832