From Continuous Dynamics to Graph Neural Networks: Neural Diffusion and Beyond

Published: 22 Feb 2024, Last Modified: 17 Sept 2024. Accepted by TMLR.
Abstract: Graph neural networks (GNNs) have demonstrated significant promise in modelling relational data and have been widely applied in various fields of interest. The key mechanism behind GNNs is so-called message passing, where information is iteratively aggregated at central nodes from their neighbourhoods. This scheme has been found to be intrinsically linked to the physical process of heat diffusion, in which the propagation of GNNs naturally corresponds to the evolution of heat density. Analogizing message passing to heat dynamics allows us to fundamentally understand the power and pitfalls of GNNs, and consequently informs better model design. Recently, a plethora of works has emerged proposing GNNs inspired by the continuous dynamics formulation, in an attempt to mitigate known limitations of GNNs such as oversmoothing and oversquashing. In this survey, we provide the first systematic and comprehensive review of studies that leverage the continuous perspective of GNNs. To this end, we introduce the foundational ingredients for adapting continuous dynamics to GNNs, along with a general framework for the design of graph neural dynamics. We then review and categorize existing works based on their driving mechanisms and underlying dynamics. We also summarize how the limitations of classic GNNs can be addressed under the continuous framework. We conclude by identifying multiple open research directions.
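To make the diffusion analogy in the abstract concrete, the following is a minimal sketch (ours, not from the paper) of the graph heat equation dX/dt = -LX and its explicit Euler discretisation X <- X - tau * LX, which coincides with one round of linear message passing. The function and variable names (`graph_heat_step`, `tau`) are illustrative assumptions, not taken from the survey.

```python
# Minimal sketch: message passing as discretised heat diffusion on a graph.
# The graph heat equation is dX/dt = -L X with Laplacian L = D - A; one
# explicit Euler step, X <- X - tau * L X, aggregates each node's neighbours,
# i.e. it is a linear message-passing layer.
import numpy as np

def graph_heat_step(X, A, tau=0.1):
    """One explicit Euler step of dX/dt = -L X, with L = D - A."""
    D = np.diag(A.sum(axis=1))   # degree matrix
    L = D - A                    # unnormalised graph Laplacian
    return X - tau * (L @ X)     # X^{k+1} = X^k - tau * L X^k

# Toy graph: a path 0 - 1 - 2, with one scalar feature per node.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1.0], [0.0], [0.0]])  # all "heat" starts at node 0

for _ in range(50):
    X = graph_heat_step(X, A)
print(X.ravel())  # features approach a constant vector
```

Iterating the step long enough drives all node features towards a constant vector, which is exactly the oversmoothing behaviour the abstract refers to; continuous-dynamics GNN designs aim to control or avoid this collapse.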
Certifications: Survey Certification
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: We have made the following major changes; all changes are highlighted in red.
* We have added a table (Table 1) summarizing the key notations used in the survey.
* We have modified the start of Section 3 to introduce the four problems, and added a paragraph summarizing the key differences between each class of dynamics.
* We have rewritten Section 5 on training graph neural dynamics, explaining both the forward and backward propagation explicitly.
* We have added Section 6, which compares the computational complexity of each dynamics.
* We have added Section 7 and Table 2, discussing the empirical benchmarks and various experiments for validating and comparing graph neural dynamics.
* We have expanded Section 8 on future directions.
----
* We have added a discussion of which dynamics are excluded from the scope of this work, and clarified the claims on graphs as approximations of geometries, as suggested by reviewer 1KdX.
Assigned Action Editor: ~Rianne_van_den_Berg1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1717