Adaptive Message Passing: A General Framework to Mitigate Oversmoothing, Oversquashing, and Underreaching

Published: 01 May 2025, Last Modified: 18 Jun 2025. ICML 2025 poster. License: CC BY-NC 4.0
TL;DR: We propose a message-passing framework for graph learning that adapts the number of layers during training and filters outgoing messages, in order to control oversmoothing, oversquashing, and underreaching.
Abstract: Long-range interactions are essential for the correct description of complex systems in many scientific fields. The price to pay for including them in the calculations, however, is a dramatic increase in the overall computational costs. Recently, deep graph networks have been employed as efficient, data-driven models for predicting properties of complex systems represented as graphs. These models rely on a message passing strategy that should, in principle, capture long-range information without explicitly modeling the corresponding interactions. In practice, most deep graph networks cannot really model long-range dependencies due to the intrinsic limitations of (synchronous) message passing, namely oversmoothing, oversquashing, and underreaching. This work proposes a general framework that \textit{learns to mitigate} these limitations: within a variational inference framework, we endow message passing architectures with the ability to adapt their depth and filter messages along the way. With theoretical and empirical arguments, we show that this strategy better captures long-range interactions, achieving results competitive with the state of the art on five node and graph prediction datasets.
Lay Summary: We propose a neural network for graph-structured data that automatically learns how many hidden layers to use. This matters because neural networks for graphs work by repeatedly exchanging information across the entities of a graph: the more message-exchange iterations, the farther information spreads. Some tasks require information to travel between distant entities, so it is important to learn the number of layers rather than manually trying different values every time. In addition, we learn to filter out irrelevant messages to avoid information overload during the spreading process.
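To make the two ideas above concrete, here is a minimal, hypothetical sketch (in NumPy, not the paper's actual implementation, which is linked below): node states from every message-passing layer are combined according to a distribution over depths, and a per-edge gate damps or filters outgoing messages. The parameters `depth_logits` and `W_gate` are illustrative stand-ins for quantities that would be learned during training.

```python
import numpy as np

def adaptive_message_passing(X, A, num_layers=4, temperature=1.0, seed=0):
    """Illustrative sketch of depth adaptation plus message filtering.

    X: (n, d) node feature matrix; A: (n, n) adjacency matrix.
    Returns a depth-weighted readout and the distribution q over depths.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Hypothetical "learned" parameters, randomly initialized here.
    depth_logits = rng.normal(size=num_layers)
    W_gate = rng.normal(size=(d, d)) * 0.1

    # q(l): a softmax distribution over layer depths; in the paper's
    # framing this would be learned variationally rather than fixed.
    q = np.exp(depth_logits / temperature)
    q /= q.sum()

    h = X.copy()
    readout = np.zeros_like(X)
    for layer in range(num_layers):
        # Per-edge scalar gate in (0, 1): filters outgoing messages.
        scores = 1.0 / (1.0 + np.exp(-(h @ W_gate @ h.T)))
        gated = A * scores            # zero out non-edges, damp messages
        h = np.tanh(gated @ h)        # aggregate the filtered messages
        readout += q[layer] * h      # weight this depth's state by q(l)
    return readout, q
```

Because the readout mixes states from all depths according to q, shallow and deep representations coexist, and gradients on q can favor deeper propagation when a task needs long-range information.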
Link To Code: https://github.com/nec-research/Adaptive-Message-Passing
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Machine Learning, Variational Inference, Oversmoothing, Oversquashing, Underreaching, Depth Learning
Submission Number: 1540