Subgraph-To-Node Translation for Efficient Representation Learning of Subgraphs

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Efficiency, Subgraphs, Graph Neural Networks
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose Subgraph-To-Node translation to efficiently learn representations of subgraphs by coarsely translating subgraphs into nodes.
Abstract: Subgraph representation learning has emerged as an important problem, but it is typically approached by running graph neural networks (GNNs) on a large global graph, an approach that demands extensive memory and computational resources. We argue that these resource requirements can be reduced by designing an efficient data structure to store and process subgraphs. In this paper, we propose Subgraph-To-Node (S2N) translation, a novel formulation for learning subgraph representations efficiently. Specifically, given a set of subgraphs in the global graph, we construct a new graph by coarsely transforming the subgraphs into nodes. We show, both theoretically and empirically, that S2N significantly reduces memory and computational costs compared to state-of-the-art models operating on conventional data structures. We also propose Coarsened S2N (CoS2N), which combines S2N with graph coarsening methods to improve results in data-scarce settings where the available subgraphs do not sufficiently cover the global graph. Our experiments on four real-world benchmarks demonstrate that fine-tuned models with S2N translation can process 183--711 times more subgraph samples than state-of-the-art models at a similar or better level of performance.
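For illustration, below is a minimal Python sketch of the S2N idea using networkx. This is a hypothetical instantiation, not the authors' code: the function name `s2n_translate` and the choice to weight the edge between two subgraph-nodes by the number of global-graph edges crossing between their node sets are assumptions; the paper may define node attributes and edge weights differently.

```python
# Hypothetical sketch of Subgraph-To-Node (S2N) translation: each subgraph
# becomes one node of a new, much smaller graph, and two subgraph-nodes are
# linked with a weight equal to the number of global-graph edges that run
# between their node sets. (Assumed scheme, for illustration only.)
import networkx as nx

def s2n_translate(global_graph: nx.Graph, subgraphs: list[set]) -> nx.Graph:
    """Coarsely translate subgraphs of `global_graph` into nodes of a new graph."""
    s2n = nx.Graph()
    s2n.add_nodes_from(range(len(subgraphs)))
    for i in range(len(subgraphs)):
        for j in range(i + 1, len(subgraphs)):
            # Count global-graph edges crossing between subgraphs i and j.
            weight = sum(
                1
                for u in subgraphs[i]
                for v in subgraphs[j]
                if global_graph.has_edge(u, v)
            )
            if weight > 0:
                s2n.add_edge(i, j, weight=weight)
    return s2n

# Toy usage: three overlapping subgraphs of a small global graph.
G = nx.karate_club_graph()
subs = [set(range(0, 12)), set(range(8, 22)), set(range(18, 34))]
H = s2n_translate(G, subs)
print(H.number_of_nodes(), H.number_of_edges())
```

The efficiency gain in this sketch comes from downstream models operating on H, whose size scales with the number of subgraphs rather than the size of the global graph.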
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2557