CTRL: Graph Condensation via Crafting Rational Trajectory Matching

15 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Dataset distillation
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Training on large-scale graphs has achieved remarkable results in graph representation learning, but its computational and storage costs have raised growing concerns. Existing graph distillation methods typically address these issues through gradient matching, but such strategies primarily emphasize matching the directions of the gradients. We empirically demonstrate that this can result in deviations in the matching trajectories and disparities in the frequency distribution. Accordingly, we propose CrafTing RationaL trajectory matching (CTRL), a novel graph dataset distillation method. CTRL introduces gradient magnitude matching into the gradient matching process by incorporating the Euclidean distance into the criterion. Additionally, because naive random-sampling initialization can disregard the evenness of the feature distribution and lack variation, we adopt a simple initialization approach that ensures evenly distributed features. CTRL not only achieves state-of-the-art performance in 34 experimental settings across 12 datasets, with lossless performance on 5 of them, but can also be easily integrated into other gradient-matching-based graph distillation methods.
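The abstract's core idea, matching gradient magnitudes in addition to directions, can be illustrated with a short sketch. The following is a minimal PyTorch illustration, not the authors' implementation: the `beta` trade-off weight and the exact combination of the cosine (direction) and Euclidean (magnitude) terms are assumptions made for exposition.

```python
import torch
import torch.nn.functional as F

def matching_loss(grads_real, grads_syn, beta=0.5):
    """Gradient matching on both direction and magnitude (illustrative sketch).

    grads_real / grads_syn: per-layer gradient tensors from a GNN trained on
    the original graph and on the condensed graph, respectively.
    beta: hypothetical weight balancing the two terms (not from the paper).
    """
    loss = torch.tensor(0.0)
    for g_r, g_s in zip(grads_real, grads_syn):
        g_r, g_s = g_r.flatten(), g_s.flatten()
        # Direction term: 1 - cosine similarity, as in prior gradient-matching
        # methods that align only gradient directions.
        direction = 1.0 - F.cosine_similarity(g_r, g_s, dim=0)
        # Magnitude term: Euclidean distance between the gradients, which is
        # what incorporating gradient magnitudes into the criterion adds.
        magnitude = torch.norm(g_r - g_s, p=2)
        loss = loss + beta * direction + (1.0 - beta) * magnitude
    return loss
```

For the initialization step, the abstract only says that synthetic features should be evenly distributed rather than randomly sampled. One plausible realization, shown purely as an assumption, is to initialize synthetic node features from cluster centroids of the real features:

```python
import numpy as np
from sklearn.cluster import KMeans

def even_feature_init(real_features: np.ndarray, n_syn: int) -> np.ndarray:
    """Initialize n_syn synthetic feature vectors that cover the real feature
    distribution evenly (here via K-Means centroids; illustrative only)."""
    km = KMeans(n_clusters=n_syn, n_init=10).fit(real_features)
    return km.cluster_centers_
```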
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 303