Abstract: Most Graph Neural Networks are based on the principle of message-passing, where all neighboring nodes exchange messages with each other simultaneously. We introduce the Flood and Echo Net, a novel architecture that aligns neural computation with the principles of distributed algorithms directly at the level of message-passing. In our method, nodes sparsely activate upon receiving a message, leading to a wave-like activation pattern that traverses the entire graph. Through these sparse but parallel activations, the network becomes provably more efficient in terms of message complexity. Moreover, the mechanism's inherent ability to generalize across graphs of varying sizes positions it as a practical architecture for the task of graph algorithmic reasoning. We empirically validate that the Flood and Echo Net improves generalization to larger graph sizes, including on the SALSA-CLRS benchmark, where it improves graph accuracy on instances 100 times larger than those seen during training.
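As a rough illustration of the wave-like activation pattern described in the abstract, the sketch below (not the authors' implementation; the adjacency-list input, the single start node, and the `flood_update`/`echo_update` functions are assumptions) shows one flood-and-echo phase: messages first propagate outward from a start node in BFS order (flood), then travel back toward it (echo), so each node activates only when a message reaches it.

```python
# Hypothetical sketch of one flood-and-echo phase (not the paper's code).
from collections import deque

def flood_and_echo_phase(adj, features, start, flood_update, echo_update):
    """One wave: flood outward from `start`, then echo back toward it.

    adj:      dict node -> list of neighbor nodes
    features: dict node -> initial node state
    """
    # BFS to group nodes by distance from the start node.
    dist, queue = {start: 0}, deque([start])
    layers = {0: [start]}
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                layers.setdefault(dist[v], []).append(v)
                queue.append(v)

    state = dict(features)

    # Flood: nodes activate layer by layer, moving away from the start.
    for d in sorted(layers)[1:]:
        for v in layers[d]:
            msgs = [state[u] for u in adj[v] if dist[u] == d - 1]
            state[v] = flood_update(state[v], msgs)

    # Echo: nodes activate layer by layer, moving back toward the start.
    for d in sorted(layers, reverse=True)[1:]:
        for v in layers[d]:
            msgs = [state[u] for u in adj[v] if dist[u] == d + 1]
            state[v] = echo_update(state[v], msgs)
    return state
```

For instance, setting the start node's state to 0, all other states to infinity, and `flood_update = lambda x, msgs: min(msgs) + 1` makes the flood pass alone compute BFS distances, the kind of graph-algorithmic task the paper targets; in a learned model the update functions would instead be neural message/update networks.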
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Moshe_Eliasof1
Submission Number: 5502