A Graph Transformer Defense against Graph Perturbation by a Flexible-Pass Filter

Published: 06 Feb 2024 · Last Modified: 19 May 2025 · OpenReview Archive Direct Upload · CC BY 4.0
Abstract: Graph perturbation degrades the performance of graph models in real-world applications, and defense methods against graph perturbation have therefore attracted increasing attention. However, existing defense methods limit model expressiveness and demand expert knowledge. To overcome these issues, we propose a flexible-frequency graph transformer that builds on the strong expressive ability of self-attention. Specifically, we design a frequency-extraction self-attention with three heads that extract multi-frequency representations, i.e., a low-frequency, a hybrid-frequency, and a high-frequency representation. An adaptive fusion method then combines these diverse representations into a single flexible-frequency representation. This improves the model's expressive ability and strengthens its defense against graph perturbation by exploiting comprehensive spectral information. In addition, we adaptively capture robust graph filters within self-attention, eliminating the need for expert knowledge. To further boost the effectiveness of self-attention, we integrate graph learning to capture graph structure before conducting node representation learning within self-attention. We also theoretically analyze the feasibility of the proposed method. Extensive experiments demonstrate that our method offers a dynamic and effective defense against graph perturbation compared with existing state-of-the-art methods.
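The abstract does not include code, but the mechanism it describes (three frequency-specific attention heads fused adaptively) can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering, not the authors' implementation: it assumes a precomputed symmetrically normalized adjacency `adj_norm`, models the low-frequency head as propagation with `adj_norm`, the high-frequency head as propagation with `I - adj_norm`, treats the plain self-attention output as the hybrid head, and fuses the three with a learned per-node softmax gate. All names (`FrequencyExtractionAttention`, `gate`, etc.) are assumptions for illustration only.

```python
# Minimal sketch (not the paper's code) of a three-head
# frequency-extraction self-attention with adaptive fusion.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FrequencyExtractionAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Gate producing one mixing weight per head, per node.
        self.gate = nn.Linear(3 * dim, 3)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # Standard scaled dot-product self-attention over all nodes.
        q, k, v = self.q(x), self.k(x), self.v(x)
        attn = F.softmax(q @ k.T / q.shape[-1] ** 0.5, dim=-1)
        hybrid = attn @ v                     # hybrid (all-pass) head

        eye = torch.eye(x.shape[0], device=x.device)
        low = adj_norm @ hybrid               # low-pass: smooth over neighbors
        high = (eye - adj_norm) @ hybrid      # high-pass: emphasize differences

        # Adaptive fusion: per-node softmax weights over the three heads.
        w = F.softmax(self.gate(torch.cat([low, hybrid, high], dim=-1)), dim=-1)
        return w[:, 0:1] * low + w[:, 1:2] * hybrid + w[:, 2:3] * high


# Toy usage on a random 5-node graph.
if __name__ == "__main__":
    n, d = 5, 16
    x = torch.randn(n, d)
    a = torch.rand(n, n)
    a = ((a + a.T) > 1.0).float()             # random symmetric adjacency
    a = a + torch.eye(n)                      # add self-loops
    deg_inv_sqrt = a.sum(-1).clamp(min=1).pow(-0.5)
    adj_norm = deg_inv_sqrt[:, None] * a * deg_inv_sqrt[None, :]
    out = FrequencyExtractionAttention(d)(x, adj_norm)
    print(out.shape)  # torch.Size([5, 16])
```

The per-node gate is one plausible reading of the paper's "adaptive fusion": it lets the model lean on the high-frequency head where neighborhoods are unreliable (e.g., under perturbed edges) and on the low-frequency head where they are informative, which matches the flexible-frequency idea described above.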