Abstract: Graph data collected from the real world often contains noise, making it imperative to develop robust representation-learning tools for graphs. Existing research has focused primarily on feature smoothing, while the robustness of the underlying geometric structure is frequently overlooked. In addition, the prevalent use of the $\ell_{2}$-norm to enforce global smoothness in graph neural networks shrinks many local characteristics, limiting their ability to express a node's neighborhood information. This article introduces novel regularizers designed to address noise in both the feature and structural aspects of graph data. We employ the alternating direction method of multipliers (ADMM) to optimize the objective function. Our approach effectively prevents graph signal representations from oversmoothing when multiple layers are applied and ensures convergence to optimal solutions. Empirical results demonstrate the superior performance of the proposed DoT over popular graph convolutions, especially in scenarios where the graph is heavily contaminated.
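To make the contrast between $\ell_{2}$ smoothing and an ADMM-optimized, local-detail-preserving regularizer concrete, the following is a minimal illustrative sketch, not the paper's DoT method: it solves the classic graph total-variation denoising problem $\min_x \tfrac{1}{2}\|x-y\|_2^2 + \lambda\|Dx\|_1$ with ADMM, where $D$ is the edge-incidence matrix. The $\ell_1$ penalty on edge differences smooths noise while keeping sharp local jumps that an $\ell_2$ (Laplacian) penalty would blur. All function names here are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_graph_tv(y, edges, lam=0.5, rho=1.0, n_iter=200):
    """ADMM for min_x 0.5*||x - y||^2 + lam*||D x||_1,
    with D the edge-incidence matrix of an undirected graph.
    Illustrative only; not the DoT regularizer from the paper."""
    n, m = len(y), len(edges)
    D = np.zeros((m, n))
    for k, (i, j) in enumerate(edges):
        D[k, i], D[k, j] = 1.0, -1.0
    A = np.eye(n) + rho * D.T @ D      # x-update system matrix
    z = np.zeros(m)                    # split variable for D x
    u = np.zeros(m)                    # scaled dual variable
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))  # x-update
        z = soft_threshold(D @ x + u, lam / rho)         # z-update
        u = u + D @ x - z                                # dual ascent
    return x

# Path graph carrying a noisy piecewise-constant signal:
edges = [(i, i + 1) for i in range(9)]
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(5), np.ones(5)]) + 0.1 * rng.standard_normal(10)
x = admm_graph_tv(y, edges, lam=0.5)
```

The denoised `x` stays flat within each segment yet preserves the jump between them, which is the local behavior that a pure $\ell_{2}$ smoothness term tends to shrink away.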
External IDs: doi:10.1109/tpami.2024.3393131