Well-conditioned Spectral Transforms for Dynamic Graph Representation

Published: 24 Nov 2022, Last Modified: 05 May 2023 · LoG 2022 Poster
Keywords: graph neural networks, low-rank self-attention, power-method SVD, framelet graph transform
TL;DR: This work establishes a fully-spectral framework to capture informative long-range temporal interactions of a dynamic system at linear complexity.
Abstract: This work establishes a fully-spectral framework to capture informative long-range temporal interactions in a dynamic system. We connect the spectral transform to low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on these observations, we leverage the adaptive power-method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes a scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, and it achieves top performance with fewer learnable parameters and faster propagation.
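The abstract's core efficiency idea is truncated SVD computed by power (subspace) iteration, which touches the matrix only through matrix-vector products. Below is a minimal NumPy sketch of that generic technique; the function name, iteration count, and details are illustrative assumptions, not the authors' released code (see the Software link for that).

```python
import numpy as np

def power_method_svd(A, rank, n_iter=20, seed=0):
    """Approximate the top-`rank` singular triplets of A via block
    power iteration. Illustrative sketch, not the paper's implementation."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    Q = rng.standard_normal((m, rank))          # random start basis
    for _ in range(n_iter):
        # One power step with A^T A, re-orthonormalized via QR for stability.
        Q, _ = np.linalg.qr(A.T @ (A @ Q))
    B = A @ Q                                   # project A onto the subspace
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U, s, Vt @ Q.T                       # A ≈ U @ diag(s) @ (Vt @ Q.T)
```

Because the loop only needs products with A and A^T, the cost per iteration is linear in the number of nonzeros of A — the property that makes a low-rank factorization usable as a linear-complexity stand-in for full self-attention.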
Type Of Submission: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/bzho3923/GNN_SpedGNN