PoincareNorm: Rethinking Over-smoothing beyond Dirichlet energy

25 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: over-smoothing, node similarity measure, normalization
TL;DR: We propose a series of node similarity measures that generalize Dirichlet energy, and a normalization called PoincareNorm that generalizes PairNorm.
Abstract:

Dirichlet energy is an intuitive and commonly used measure of over-smoothing. However, it only captures information about the first-order derivative of node features. In light of this, we propose a series of node similarity measures defined as the energies of higher-order derivatives of features, which generalize Dirichlet energy. We rigorously analyze the properties of the proposed measures and apply them to establish the sharp decay rate of Dirichlet energy under continuous diffusion or discrete random walks, a rate closely related to the first nonzero eigenvalue of the graph Laplacian. Lastly, to address over-smoothing with respect to these measures, we propose a normalization termed PoincareNorm, which generalizes PairNorm to control the proposed measures. On the semi-supervised node classification task without missing features, PoincareNorm outperforms existing normalization methods.
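
The sketch below illustrates, under common assumptions, the kind of quantities the abstract refers to: the standard Dirichlet energy E(X) = tr(X^T L X) for node features X and graph Laplacian L, a hypothetical higher-order variant tr(X^T L^k X) standing in for the paper's (unspecified) generalized measures, and a PairNorm-style rescaling (Zhao & Akoglu, 2020). It is not the authors' implementation of PoincareNorm, whose exact form is not given on this page.

import numpy as np

def graph_laplacian(adj: np.ndarray) -> np.ndarray:
    """Unnormalized Laplacian L = D - A of an undirected graph."""
    deg = np.diag(adj.sum(axis=1))
    return deg - adj

def dirichlet_energy(x: np.ndarray, lap: np.ndarray) -> float:
    """E(X) = tr(X^T L X): first-order variation of features across edges."""
    return float(np.trace(x.T @ lap @ x))

def higher_order_energy(x: np.ndarray, lap: np.ndarray, k: int) -> float:
    """Hypothetical k-th order energy tr(X^T L^k X); k = 1 recovers Dirichlet energy."""
    return float(np.trace(x.T @ np.linalg.matrix_power(lap, k) @ x))

def pairnorm(x: np.ndarray, scale: float = 1.0, eps: float = 1e-6) -> np.ndarray:
    """PairNorm-style normalization: center features across nodes, then rescale
    so the root-mean-square row norm (hence total pairwise distance) stays constant."""
    x = x - x.mean(axis=0, keepdims=True)           # center across nodes
    rms = np.sqrt((x ** 2).sum(axis=1).mean())      # root-mean-square row norm
    return scale * x / (rms + eps)

# Toy usage: a 4-node path graph with random 2-dimensional features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
lap = graph_laplacian(adj)
x = np.random.default_rng(0).normal(size=(4, 2))
print(dirichlet_energy(x, lap), higher_order_energy(x, lap, k=2))
print(pairnorm(x))

As layers of a GNN smooth features, both energies above decay toward zero; a PairNorm-style rescaling counteracts this by holding the post-layer feature variance fixed, which is the role the abstract ascribes to PoincareNorm for its more general measures.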

Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5273