On the Equivalence between Positional Node Embeddings and Structural Graph Representations

25 Sept 2019 (modified: 22 Sept 2020), ICLR 2020 Conference Blind Submission
Code: https://github.com/PurdueMINDS/Equivalence
TL;DR: We develop the foundations of a unifying theoretical framework connecting node embeddings and structural graph representations through invariant theory
Abstract: This work provides the first unifying theoretical framework for node (positional) embeddings and structural graph representations, bridging methods like matrix factorization and graph neural networks. Using invariant theory, we show that the relationship between structural representations and node embeddings is analogous to that between a distribution and its samples. We prove that all tasks that can be performed with node embeddings can also be performed with structural representations, and vice versa. We also show that the transductive/inductive distinction is unrelated to the distinction between node embeddings and graph representations, clearing another source of confusion in the literature. Finally, we introduce new practical guidelines for generating and using node embeddings, augmenting the standard operating procedures used today.
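The central analogy in the abstract — structural representations behave like a distribution, node embeddings like its samples — can be illustrated with a small NumPy sketch. This is not the paper's construction: the toy graph, the `structural` feature map (sorted rows of powers of the adjacency matrix), and the randomized `positional` embedding are illustrative assumptions chosen to make the contrast concrete.

```python
import numpy as np

# Toy graph: a 4-cycle. Nodes 0 and 2 (and 1 and 3) are structurally
# identical: a graph automorphism maps one onto the other.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

def structural(A, k=3):
    """Illustrative permutation-equivariant node representation:
    sorted rows of A, A^2, ..., A^k. Automorphic nodes receive
    identical representations."""
    feats, P = [], np.eye(len(A))
    for _ in range(k):
        P = P @ A
        feats.append(np.sort(P, axis=1))  # sorting discards neighbor labels
    return np.concatenate(feats, axis=1)

def positional(A, d=2, seed=None):
    """Illustrative randomized positional embedding: propagate random
    features through the graph. Each run is one 'sample' from the
    embedding distribution."""
    rng = np.random.default_rng(seed)
    return A @ rng.standard_normal((len(A), d))

S = structural(A)
print(np.allclose(S[0], S[2]))  # True: automorphic nodes agree

# Equivariance: relabeling the graph just permutes the representations.
perm = np.array([2, 0, 3, 1])
P = np.eye(4)[perm]
print(np.allclose(structural(P @ A @ P.T), S[perm]))  # True

# Positional embeddings differ run to run, like samples from a distribution.
Z1, Z2 = positional(A, seed=0), positional(A, seed=1)
print(np.allclose(Z1, Z2))  # False: two different "samples"
```

The sketch shows why neither side is strictly more powerful: the structural map is deterministic and invariant to relabeling, while the positional embedding carries sample-specific coordinates — matching the paper's claim that each can emulate tasks performed with the other.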
Keywords: Graph Neural Networks, Structural Graph Representations, Node Embeddings, Relational Learning, Invariant Theory, Theory, Deep Learning, Representational Power, Graph Isomorphism