On the Equivalence between Positional Node Embeddings and Structural Graph Representations

Published: 20 Dec 2019 · Last Modified: 05 May 2023 · ICLR 2020 Conference Blind Submission · Readers: Everyone
TL;DR: We develop the foundations of a unifying theoretical framework connecting node embeddings and structural graph representations through invariant theory
Abstract: This work provides the first unifying theoretical framework for node (positional) embeddings and structural graph representations, bridging methods like matrix factorization and graph neural networks. Using invariant theory, we show that the relationship between structural representations and node embeddings is analogous to that between a distribution and its samples. We prove that all tasks that can be performed with node embeddings can also be performed with structural representations, and vice-versa. We also show that the distinction between transductive and inductive learning is unrelated to the distinction between node embeddings and graph representations, clearing another source of confusion in the literature. Finally, we introduce new practical guidelines for generating and using node embeddings, which augment the standard operating procedures used today.
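The following is a minimal sketch (not taken from the paper or its released code) illustrating the distinction the abstract draws: positional embeddings, here obtained matrix-factorization-style from an eigendecomposition of the adjacency matrix, assign different vectors to structurally identical nodes, while a simple permutation-equivariant, GNN-like aggregation assigns them the same structural representation. The graph, function names, and aggregation scheme are illustrative assumptions only.

```python
import numpy as np

# Toy undirected graph: a path 0 - 1 - 2 - 3 (nodes 0 and 3 are structurally identical).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def positional_embeddings(adj, k=2):
    """Matrix-factorization-style node embeddings: top-k eigenvectors of A."""
    vals, vecs = np.linalg.eigh(adj)      # eigenvalues in ascending order
    return vecs[:, -k:]                   # one k-dimensional vector per node

def structural_representations(adj, rounds=2):
    """GNN-like, permutation-equivariant features from repeated neighbor aggregation."""
    h = adj.sum(axis=1, keepdims=True)    # round 0: node degrees
    feats = [h]
    for _ in range(rounds):
        h = adj @ h                       # aggregate features over neighbors
        feats.append(h)
    return np.hstack(feats)               # one feature row per node

pos = positional_embeddings(A)
struct = structural_representations(A)

# Structurally identical nodes (the two path endpoints) share a structural representation...
print(np.allclose(struct[0], struct[3]))  # True
# ...but their positional embeddings generally differ, since positional embeddings
# also encode *where* a node sits in this particular graph (akin to a sample from
# the distribution that the structural representation describes).
print(np.allclose(pos[0], pos[3]))        # typically False
```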
Keywords: Graph Neural Networks, Structural Graph Representations, Node Embeddings, Relational Learning, Invariant Theory, Theory, Deep Learning, Representational Power, Graph Isomorphism
Code: https://github.com/PurdueMINDS/Equivalence
