Abstract: We study the ability of a one-layer Graph Attention Network (GAT) to achieve perfect node classification on a simple synthetic data model, the contextual stochastic block model (CSBM). We determine a \textit{positive} CSBM parameter regime in which GAT achieves perfect classification and a \textit{negative} regime in which it fails to do so. For the positive result we use an attention mechanism that generalizes the original one~\citep{Velickovic2018GraphAN}. For the negative result we consider a fixed attention mechanism constructed using the node labels. We pose two questions. First, \textit{does GAT achieve perfect classification under weaker conditions than a simple community detection method, namely thresholding the second principal eigenvector of the adjacency matrix~\citep{Abbe2018}?} The answer is negative, and it depends on the parameter regime of the CSBM: GAT couples the graph information with the feature information through matrix multiplication, and this coupling can be detrimental to perfect node classification. Second, \textit{does GAT achieve perfect classification under weaker conditions than a simple graph convolutional network (GCN)~\citep{kipf:gcn}?} We show that GAT is better than GCN when its attention mechanism is a Lipschitz function, but not when it fails to be Lipschitz.
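To make the comparison concrete, below is a minimal sketch (not the authors' code) of the spectral baseline discussed in the abstract: sampling a two-community CSBM and classifying nodes by thresholding the second principal eigenvector of the adjacency matrix. All parameter values (n, p, q, mu, sigma) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Sample a two-community CSBM: intra-edge prob p, inter-edge prob q ---
n, p, q = 200, 0.5, 0.1                        # illustrative values, not from the paper
labels = np.repeat([0, 1], n // 2)             # ground-truth communities
same = labels[:, None] == labels[None, :]      # same-community mask
A = (rng.random((n, n)) < np.where(same, p, q)).astype(float)
A = np.triu(A, 1)                              # strict upper triangle: no self-loops
A = A + A.T                                    # symmetrize into an undirected graph

# Gaussian node features with community-dependent means (the "contextual" part
# of the CSBM); the spectral baseline below ignores them on purpose.
d, mu, sigma = 10, 1.0, 1.0
means = np.outer(2 * labels - 1, np.full(d, mu / np.sqrt(d)))
X = means + sigma * rng.standard_normal((n, d))

# --- Baseline: threshold the second principal eigenvector of A ---
eigvals, eigvecs = np.linalg.eigh(A)           # eigenvalues in ascending order
v2 = eigvecs[:, -2]                            # eigenvector of the second-largest eigenvalue
pred = (v2 > 0).astype(int)

# The eigenvector's sign is arbitrary, so score against both label orderings.
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"spectral thresholding accuracy: {acc:.3f}")
```

A GCN- or GAT-style classifier would instead predict from graph-convolved features (roughly, a thresholded projection of a normalized A @ X), i.e., it couples A and X through matrix multiplication; which approach succeeds under weaker CSBM conditions is exactly what the paper characterizes.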
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Theory (e.g., control theory, learning theory, algorithmic game theory)