Are Graph Attention Networks Attentive Enough? Rethinking Graph Attention by Capturing Homophily and Heterophily

22 Sept 2022 (modified: 13 Feb 2023) | ICLR 2023 Conference Withdrawn Submission | Readers: Everyone
Keywords: Graph ML, attention mechanism
TL;DR: Propose a new attention mechanism for Graph ML
Abstract: Attention mechanisms have been successfully applied in Graph Neural Networks (GNNs). However, as messages propagate along edges, the embeddings of edge-connected nodes grow closer even though we cannot guarantee that these nodes share similar features and labels, especially in heterophilic graphs. Current attention mechanisms cannot adaptively extract information from neighbors because they do not fully use graph structural information in the self-attention calculation. We introduce a new graph attention mechanism (GATv3) that directly incorporates graph structural information into the self-attention calculation, making it aware of the homophily or heterophily of the graph. We conduct an extensive evaluation on node classification tasks and show that using structural information and node features simultaneously extracts more diverse attention scores. Our code is available at https://github.com/anonymousSubscriber/G-GAT
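To make the abstract's central idea concrete, the sketch below shows one plausible way to let attention logits depend on graph structure as well as node features: a GATv2-style feature score is combined with a second score computed from adjacency-propagated features. This is a minimal illustration of the general idea, not the authors' actual GATv3; all names, shapes, and design choices here are hypothetical, and the dense (N, N) formulation is used only for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructureAwareAttention(nn.Module):
    """Toy single-head graph attention layer whose logits combine a
    GATv2-style feature score with a structural score computed from
    adjacency-propagated features. Purely illustrative; NOT the
    paper's GATv3."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)       # feature transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)        # feature-pair scorer
        self.S = nn.Linear(in_dim, out_dim, bias=False)       # structural transform
        self.b = nn.Linear(2 * out_dim, 1, bias=False)        # structural-pair scorer

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops.
        n = x.size(0)
        h = self.W(x)                                         # (N, out_dim)
        hi = h.unsqueeze(1).expand(n, n, -1)                  # h_i broadcast over pairs
        hj = h.unsqueeze(0).expand(n, n, -1)                  # h_j broadcast over pairs
        feat = self.a(F.leaky_relu(torch.cat([hi, hj], -1), 0.2)).squeeze(-1)

        # Structural branch: adj @ x mixes each node's neighborhood, so the
        # resulting pair score also reflects local graph structure.
        s = self.S(adj @ x)                                   # (N, out_dim)
        si = s.unsqueeze(1).expand(n, n, -1)
        sj = s.unsqueeze(0).expand(n, n, -1)
        struct = self.b(F.leaky_relu(torch.cat([si, sj], -1), 0.2)).squeeze(-1)

        logits = feat + struct
        logits = logits.masked_fill(adj == 0, float("-inf"))  # attend only along edges
        alpha = torch.softmax(logits, dim=-1)                 # (N, N) attention weights
        return alpha @ h                                      # aggregated node embeddings

# Smoke test on a random 5-node graph.
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj.fill_diagonal_(1.0)  # self-loops keep every softmax row non-empty
out = StructureAwareAttention(8, 16)(x, adj)
print(out.shape)  # torch.Size([5, 16])
```

A real implementation would operate on sparse edge lists (as in torch_geometric) rather than dense N x N tensors; the dense form is chosen here only to keep the pairwise scoring explicit.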
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: General Machine Learning (i.e., none of the above)
Supplementary Material: zip