Keywords: Graph Clustering, Graph Transformer
Abstract: Graph clustering is a fundamental unsupervised task in graph mining. However, mainstream clustering methods are built on graph neural networks and thus inevitably struggle to capture long-range dependencies. Moreover, the prevailing two-stage clustering scheme, consisting of representation learning followed by clustering, limits the graph encoder's ability to fully exploit task-related information, resulting in suboptimal embeddings. In this work, we propose CTGC ($\textbf{C}$luster-Aware $\textbf{T}$ransformer for $\textbf{G}$raph $\textbf{C}$lustering) to mitigate these issues. Specifically, given the transformer's strength in modeling long-range dependencies, we first introduce the transformer to graph clustering as the core graph encoder. To further enhance the encoder's task awareness during representation learning, we present two mechanisms: momentum cluster-aware attention and cluster-aware regularization. In momentum cluster-aware attention, previous clustering results guide node embedding generation through specially designed cluster-aware queries. Cluster-aware regularization fuses cluster information into boundary nodes by minimizing the overlap between different clusters while maximizing the completeness of each cluster. We evaluate our method on seven real-world graph datasets, where it outperforms existing state-of-the-art methods, demonstrating its effectiveness in improving the quality of graph clustering.
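To make the momentum cluster-aware attention idea concrete, below is a minimal sketch of one plausible realization: cluster centroids derived from the previous clustering round are momentum-updated and appended as extra queries, biasing attention toward cluster structure. This is an illustrative assumption, not the authors' implementation; the momentum rate `m`, the projection layers, and the centroid-update rule are all hypothetical choices.

```python
# Hedged sketch of momentum cluster-aware attention (assumptions, not the
# paper's actual method): centroids from prior clustering results serve as
# additional cluster-aware queries and are refreshed via momentum updates.
import torch
import torch.nn.functional as F


class MomentumClusterAwareAttention(torch.nn.Module):
    def __init__(self, dim: int, num_clusters: int, m: float = 0.99):
        super().__init__()
        self.m = m  # momentum coefficient (assumed hyperparameter)
        self.q_proj = torch.nn.Linear(dim, dim)
        self.k_proj = torch.nn.Linear(dim, dim)
        self.v_proj = torch.nn.Linear(dim, dim)
        # Cluster-aware queries: one learnable-free centroid per cluster,
        # initialized randomly and updated from clustering results.
        self.register_buffer("centroids", torch.randn(num_clusters, dim))

    @torch.no_grad()
    def update_centroids(self, x: torch.Tensor, assignments: torch.Tensor):
        # Momentum-update each centroid toward the mean embedding of the
        # nodes assigned to that cluster in the previous clustering round.
        for c in range(self.centroids.size(0)):
            mask = assignments == c
            if mask.any():
                self.centroids[c] = (
                    self.m * self.centroids[c]
                    + (1.0 - self.m) * x[mask].mean(dim=0)
                )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Queries come from both node embeddings and cluster centroids, so
        # the attention map is guided by previous clustering results.
        q = self.q_proj(torch.cat([x, self.centroids], dim=0))
        k, v = self.k_proj(x), self.v_proj(x)
        attn = F.softmax(q @ k.t() / k.size(-1) ** 0.5, dim=-1)
        out = attn @ v
        return out[: x.size(0)]  # keep only the node rows
```

A typical usage loop would alternate `forward` passes with a clustering step (e.g., k-means on the embeddings) followed by `update_centroids`, so the cluster-aware queries track the evolving partition.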
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 23