Grassmann Graph Embedding

Mar 08, 2021 (edited Apr 23, 2021) · GTRL 2021 Poster
  • Keywords: Graph Neural Networks, Grassmann Manifold, Manifold Embedding, Graph Representation Learning, Low Rank Approximation
  • TL;DR: This work presents a new strategy of abstracting graph data to low-dimensional Grassmann points and proposes an effective graph pooling mechanism.
  • Abstract: Geometric deep learning, which exploits the geometric and topological features of data, has attracted increasing attention in deep neural networks. Learning the intrinsic structural properties of data is a crucial step for dimensionality reduction and effective feature extraction. This paper develops Grassmann graph embedding, which combines graph convolutions with a mapping that captures the principal components of graphs' hidden representations. Each set of featured graph nodes is mapped to a point on a Grassmann matrix manifold through Singular Value Decomposition, and that point is then embedded into a symmetric matrix space that approximates denoised second-order feature information. This view of treating a graph's nodes as a set could inspire many further applications. In particular, we propose Grassmann (global graph) pooling, which can be combined with any graph convolution in a graph neural network. Grassmann pooling achieves state-of-the-art performance on a variety of graph prediction benchmarks.
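The abstract's pipeline — SVD of a graph's node-feature set, a point on a Grassmann manifold, then a symmetric-matrix embedding — can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the function name `grassmann_pool`, the use of the right singular vectors (so the pooled output has a fixed size independent of node count), and the subspace dimension `k` are all assumptions.

```python
import numpy as np

def grassmann_pool(H, k):
    """Pool node features H (n x d) into a fixed-size symmetric matrix.

    Hypothetical sketch: the top-k right singular vectors of H span a
    k-dimensional subspace of feature space, i.e. a point on the Grassmann
    manifold Gr(k, d). Its projection matrix V_k V_k^T is a symmetric (d, d)
    embedding of that point, independent of the number of nodes n.
    """
    # Thin SVD of the node set; rows of Vt are right singular vectors.
    U, S, Vt = np.linalg.svd(H, full_matrices=False)
    Vk = Vt[:k].T          # (d, k) orthonormal basis: a Grassmann point
    return Vk @ Vk.T       # (d, d) symmetric-space embedding of the graph
```

Because the output is a projection matrix, it is symmetric and idempotent with trace k, and its size depends only on the feature dimension d — which is what lets it act as a global pooling layer over graphs with varying node counts.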