Fast Node Embeddings: Learning Ego-Centric Representations

Anonymous

Nov 03, 2017 (modified: Dec 15, 2017), ICLR 2018 Conference Blind Submission
  • Abstract: Representation learning is one of the foundations of Deep Learning and has enabled important improvements in several Machine Learning tasks, such as Neural Machine Translation, Question Answering and Speech Recognition. Recent works have proposed new methods for learning representations of nodes and edges in graphs. Several of these methods are based on the SkipGram algorithm, and they usually process a large number of multi-hop neighbors to produce the context from which node representations are learned. In this paper, we propose a both effective and efficient method for generating node embeddings in graphs that employs a restricted number of permutations over the immediate neighborhood of a node as the context from which its representation is generated, yielding ego-centric representations. We present a thorough evaluation showing that our method outperforms state-of-the-art methods on six different datasets related to the problems of link prediction and node classification, while being one to three orders of magnitude faster than baselines when generating node embeddings for very large graphs.
  • TL;DR: A faster method for generating node embeddings that employs a number of permutations over a node's immediate neighborhood as context to generate its representation.
  • Keywords: Graph, Node Embeddings, Distributed Representations, Learning Representations
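The paper's code is not included on this page, but the core idea described in the abstract (using permutations of a node's immediate neighbors as SkipGram contexts) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the adjacency-dict representation, the function name `neighborhood_contexts`, and the parameter `num_permutations` are all assumptions for the sake of the example.

```python
import random

def neighborhood_contexts(adj, num_permutations, seed=0):
    """Sketch of ego-centric context generation (hypothetical helper).

    For each node, emit `num_permutations` sequences of the form
    [node] + shuffled immediate neighbors. Each sequence could then be
    fed to any SkipGram-style trainer as a training "sentence".
    """
    rng = random.Random(seed)
    contexts = []
    for node, neighbors in adj.items():
        for _ in range(num_permutations):
            perm = list(neighbors)  # copy so we never mutate the graph
            rng.shuffle(perm)       # one random permutation of the 1-hop neighborhood
            contexts.append([node] + perm)
    return contexts

# Example: a tiny star graph with center 0
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
contexts = neighborhood_contexts(adj, num_permutations=2)
```

Because only the 1-hop neighborhood is permuted, the number of generated contexts grows with the number of nodes and the chosen permutation count, not with multi-hop walk lengths, which is consistent with the efficiency claim in the abstract.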
