Graph layouts and graph contrastive learning via neighbour embeddings

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph Layout, Contrastive Learning, t-SNE
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: t-SNE and Contrastive Neighbor Embeddings can be used to learn state-of-the-art graph representations while being conceptually simpler than most other methods.
Abstract: In node-level graph representation learning, there are two distinct paradigms. One is known as graph layouts, where nodes are embedded into 2D space for visualization purposes. The other is graph contrastive learning, where nodes are parametrically embedded into a high-dimensional vector space based on node features. In this work, we show that these two paradigms are intimately related, and that both can be successfully approached via neighbour embedding methods. First, we introduce graph t-SNE for two-dimensional graph drawing, and show that the resulting layouts outperform all existing algorithms in terms of local structure preservation, as measured by kNN classification accuracy. Second, we introduce graph contrastive neighbor embedding (graph CNE), which uses a fully-connected neural network to transform graph node features into an embedding space by optimizing the contrastive InfoNCE objective. We show that graph CNE, while being conceptually simpler than most existing graph contrastive learning methods, produces competitive node representations, with state-of-the-art linear classification accuracy.
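
The abstract gives no implementation details, so the following PyTorch sketch is only a rough, assumption-based illustration of the kind of objective it describes: an MLP encoder trained with an InfoNCE loss in which graph neighbours are treated as positive pairs and the other nodes in the batch as negatives. The function name, layer sizes, temperature, and sampling scheme below are placeholders of my own, not taken from the paper.

    import torch
    import torch.nn.functional as F

    def info_nce(z_i, z_j, temperature=0.5):
        # Normalize embeddings so that dot products are cosine similarities.
        z_i = F.normalize(z_i, dim=1)
        z_j = F.normalize(z_j, dim=1)
        # Similarity matrix between anchors and their positives; the
        # off-diagonal entries act as in-batch negatives.
        logits = z_i @ z_j.t() / temperature
        labels = torch.arange(z_i.size(0), device=z_i.device)
        return F.cross_entropy(logits, labels)

    # Hypothetical usage: a plain fully-connected encoder maps the features of a
    # sampled node and of one of its graph neighbours into the embedding space;
    # the loss pulls neighbouring nodes together and pushes non-neighbours apart.
    encoder = torch.nn.Sequential(
        torch.nn.Linear(128, 256), torch.nn.ReLU(), torch.nn.Linear(256, 64)
    )
    x_anchor = torch.randn(32, 128)     # features of sampled nodes
    x_neighbour = torch.randn(32, 128)  # features of one sampled neighbour each
    loss = info_nce(encoder(x_anchor), encoder(x_neighbour))
    loss.backward()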
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5711