GwAC: GNNs with Asynchronous Communication

Published: 18 Nov 2023, Last Modified: 29 Nov 2023 · LoG 2023 Poster
Keywords: GNNs, Weisfeiler-Lehman, Oversmoothing, Underreaching
TL;DR: GNNs using asynchronous instead of synchronous communication (no aggregation)
Abstract: This paper studies the relation between Graph Neural Networks and distributed computing models to propose a new framework for learning on graphs. Current Graph Neural Networks (GNNs) are closely related to the synchronous model from distributed computing: nodes operate in rounds and receive neighborhood information aggregated and all at the same time. Our new framework, in contrast, proposes GNNs with Asynchronous Communication (GwAC): every message is received individually and at a potentially different time. We prove that this framework is at least as expressive as the existing synchronous framework. We further analyze GwAC theoretically and practically with regard to several GNN problems: expressiveness beyond 1-Weisfeiler-Lehman (1-WL), underreaching, and oversmoothing. GwAC shows promising improvements for all of these problems. We finish with a practical study on how to implement GwAC GNNs efficiently.
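The contrast the abstract draws can be sketched in a few lines of code. The snippet below is a minimal illustration, not the paper's implementation: the toy graph, the mean aggregation in the synchronous round, and the per-message convex-combination update in the asynchronous pass are all placeholder choices made for this sketch.

```python
import random

# Toy graph as an adjacency list over 4 nodes (illustrative only).
edges = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
init = {v: [float(v)] for v in edges}  # 1-dimensional node states

def sync_round(state):
    """Synchronous GNN round: every node aggregates all neighbor messages at once."""
    return {v: [sum(state[u][0] for u in edges[v]) / len(edges[v])] for v in edges}

def async_pass(state, steps=12, seed=0):
    """Asynchronous communication: messages are delivered one at a time, in an
    arbitrary order, and each message immediately updates only its receiver."""
    rng = random.Random(seed)
    state = {v: s[:] for v, s in state.items()}
    for _ in range(steps):
        u = rng.choice(list(edges))      # sender of this single message
        v = rng.choice(edges[u])         # receiver of this single message
        state[v][0] = 0.5 * state[v][0] + 0.5 * state[u][0]  # per-message update
    return state

print("synchronous :", sync_round(init))
print("asynchronous:", async_pass(init))
```

In the synchronous version all neighbor information reaches a node in one aggregated step per round, whereas in the asynchronous version each message arrives on its own and can update the receiver's state before any other message is seen.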
Supplementary Materials: zip
Submission Type: Full paper proceedings track submission (max 9 main pages).
Software: https://github.com/lukasjf/gwac/
Poster: png
Poster Preview: png
Submission Number: 78