Learning Graph Representations via Graph Entropy Maximization

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Graph representation learning, Körner graph entropy, orthonormal representations, chromatic entropy
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We learn orthonormal graph representations by maximizing the Körner graph entropy.
Abstract: Graph representation learning aims to represent graphs as vectors that can be utilized in downstream tasks such as graph classification. In this work, we focus on learning diverse representations that capture as much graph information as possible. We propose to quantify graph information using graph entropy, where we define a probability distribution over a graph based on its node and global representations. However, computing graph entropy is NP-hard due to the complex vertex packing polytope involved in its definition. We therefore provide an approximation of graph entropy based on the Shannon entropy and the chromatic entropy. By maximizing this approximation through graph neural networks, we obtain informative node and graph representations. Experimental results demonstrate the effectiveness of our method in comparison with baselines on unsupervised and semi-supervised learning tasks.
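The abstract's core idea, defining a probability distribution over a graph from its node and global representations and maximizing an entropy surrogate, can be illustrated with a minimal sketch. This is an assumption-based illustration only: the paper's exact distribution, readout function, and chromatic-entropy approximation are not given here, so the mean-pooling readout, dot-product scores, and plain Shannon entropy below are stand-ins.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum p_i log p_i, with eps for stability."""
    return -np.sum(p * np.log(p + eps))

# Hypothetical setup: Z holds node representations (n nodes x d dims),
# produced in the paper by a GNN; here they are random for illustration.
rng = np.random.default_rng(0)
Z = rng.normal(size=(5, 8))

# Assumed global representation: mean-pooling readout over nodes.
g = Z.mean(axis=0)

# Assumed distribution over nodes from node-global similarity scores.
p = softmax(Z @ g)

# Shannon-entropy surrogate: the quantity a training loop would maximize
# (e.g., as the negative of a loss) to encourage diverse representations.
H = shannon_entropy(p)
```

In a training loop, `-H` would serve as (part of) the loss so that gradient descent pushes the node-level distribution toward higher entropy, i.e., more diverse representations.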
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5784