Aggregation Buffer: Revisiting DropEdge with a New Parameter Block

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We propose Aggregation Buffer, a plug-in parameter block designed to address the fundamental limitations of DropEdge and significantly improve the robustness of various GNN architectures.
Abstract: We revisit DropEdge, a data augmentation technique for GNNs that randomly removes edges to expose diverse graph structures during training. Although it is a promising approach for reducing overfitting to specific connections in the graph, we observe that its potential performance gain in supervised learning tasks is significantly limited. To understand why, we provide a theoretical analysis showing that the limited performance of DropEdge stems from a fundamental limitation shared by many GNN architectures. Based on this analysis, we propose **Aggregation Buffer**, a parameter block specifically designed to improve the robustness of GNNs by addressing this limitation of DropEdge. Our method is compatible with any GNN model and shows consistent performance improvements on multiple datasets. Moreover, it serves as a unifying solution to well-known problems such as degree bias and structural disparity. Code and datasets are available at https://github.com/dooho00/agg-buffer.
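
To make the setup concrete, here is a minimal sketch of the DropEdge augmentation in PyTorch. The `drop_edge` helper and the `[2, num_edges]` edge-index convention follow common PyTorch Geometric usage; they are illustrative assumptions, not the repository's API.

```python
import torch

def drop_edge(edge_index: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    """DropEdge: independently remove each edge with probability p.

    edge_index: [2, num_edges] COO connectivity (PyTorch Geometric style).
    """
    num_edges = edge_index.size(1)
    # Bernoulli keep-decisions; each edge survives with probability 1 - p.
    keep = torch.rand(num_edges, device=edge_index.device) >= p
    return edge_index[:, keep]

# Resampled at every training step, so the model sees a different
# sparse view of the graph each time:
#   out = model(x, drop_edge(edge_index, p=0.3))
```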
Lay Summary: Randomly removing parts of the input is a common way to help machine learning models handle data variation. In graph-structured data—like social networks, where items are linked by edges—a technique called “DropEdge” removes some connections during training to improve reliability. However, we observe that its effectiveness is significantly limited in practice. Our analysis reveals that the issue lies in how graph models aggregate information from connected nodes. To address this, we introduce a post-training component called the “Aggregation Buffer.” We attach it to a trained model to improve its ability to handle varying connection patterns. In tests on 12 datasets, it consistently and significantly improves performance. Our work highlights the importance of edge robustness—an often-overlooked issue—and offers a simple yet effective way to enhance graph models after training.
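
This page does not spell out how the buffer is parameterized, so purely as an illustration of the plug-in, post-training idea described above, here is a hypothetical sketch in PyTorch. The class name, the zero-initialized residual form, and the freeze-then-tune usage are all assumptions, not the authors' implementation; see the repository below for the actual method.

```python
import torch
import torch.nn as nn

class AggregationBuffer(nn.Module):
    """Hypothetical plug-in block: a learnable residual correction
    applied to a GNN layer's aggregated messages (illustrative only)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.correction = nn.Linear(hidden_dim, hidden_dim)
        # Zero init: attaching the buffer initially leaves the trained
        # model's predictions unchanged.
        nn.init.zeros_(self.correction.weight)
        nn.init.zeros_(self.correction.bias)

    def forward(self, aggregated: torch.Tensor) -> torch.Tensor:
        return aggregated + self.correction(aggregated)

# Post-training usage sketch: freeze the trained GNN and optimize
# only the buffer parameters.
#   for param in gnn.parameters():
#       param.requires_grad_(False)
#   buffer = AggregationBuffer(hidden_dim=64)
#   optimizer = torch.optim.Adam(buffer.parameters(), lr=1e-3)
```

Zero initialization is a common choice for adapter-style blocks like this one, since fine-tuning then starts exactly from the base model's behavior; whether the paper makes the same choice is not stated here.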
Link To Code: https://github.com/dooho00/agg-buffer
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph neural networks, DropEdge, data augmentation, edge robustness, node classification, degree bias, structural disparity
Submission Number: 10489