Keywords: Graph Neural Networks, Graph Transformers, Graph Representation Learning
Abstract: Graph Transformers (GTs) have emerged as a powerful paradigm for graph representation learning due to their ability to model diverse node interactions.
However, existing GTs often rely on intricate architectural designs tailored to specific interaction types, limiting their flexibility.
To address this, we propose a unified hierarchical mask framework that reveals an underlying equivalence between model architecture and attention mask construction.
This framework enables a consistent modeling paradigm by capturing diverse interactions through carefully designed attention masks.
Theoretical analysis under this framework demonstrates that the probability of correct classification positively correlates with the receptive field size and label consistency, leading to a fundamental design principle:
An effective attention mask should ensure both a sufficiently large receptive field and a high level of label consistency.
While no single existing mask satisfies this principle across all scenarios, our analysis reveals that hierarchical masks offer complementary strengths—motivating their effective integration.
Guided by this principle, we introduce M$^3$Dphormer, a Mixture-of-Experts based Graph Transformer with Multi-Level Masking and Dual Attention Computation.
M$^3$Dphormer incorporates three theoretically grounded hierarchical masks and employs a bi-level expert routing mechanism to adaptively integrate multi-level interaction information.
To ensure scalability, we further introduce a dual attention computation scheme that dynamically switches between dense and sparse modes based on local mask sparsity.
Extensive experiments across multiple benchmarks demonstrate that M$^3$Dphormer achieves state-of-the-art performance,
validating the effectiveness of our unified framework and model design.
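As a rough illustration of the ideas summarized above (not the authors' implementation), the sketch below builds simple hop-based attention masks and switches between a dense and a sparse attention computation depending on how dense the local mask is. The function names (`k_hop_mask`, `dual_attention`), the `density_threshold` parameter, and the specific switching rule are assumptions introduced only for illustration.

```python
# Minimal sketch, assuming boolean masks and a single attention head.
import torch
import torch.nn.functional as F


def k_hop_mask(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Boolean [n, n] mask allowing attention within k hops (self-loops included)."""
    n = adj.size(0)
    reach = torch.eye(n, dtype=torch.bool, device=adj.device)
    frontier = reach
    for _ in range(k):
        frontier = (frontier.float() @ adj.float()) > 0
        reach = reach | frontier
    return reach


def dual_attention(q, k, v, mask, density_threshold=0.5):
    """q, k, v: [n, d] node features; mask: [n, n] boolean attention mask.

    Assumes every node can attend to at least itself (self-loop in the mask).
    """
    n, d = q.shape
    scale = d ** -0.5
    if mask.float().mean() >= density_threshold:
        # Dense mode: full n x n score matrix, disallowed pairs set to -inf.
        scores = (q @ k.t()) * scale
        scores = scores.masked_fill(~mask, float("-inf"))
        return F.softmax(scores, dim=-1) @ v
    # Sparse mode: only score the (i, j) pairs permitted by the mask.
    idx_i, idx_j = mask.nonzero(as_tuple=True)
    pair_scores = (q[idx_i] * k[idx_j]).sum(-1) * scale
    out = torch.zeros_like(v)
    for i in range(n):
        sel = idx_i == i
        if sel.any():
            # Per-row softmax over node i's allowed neighbors, then weighted sum.
            w = F.softmax(pair_scores[sel], dim=0)
            out[i] = (w.unsqueeze(-1) * v[idx_j[sel]]).sum(0)
    return out
```

In this toy setting, `k_hop_mask(adj, 1)` yields GNN-style local attention while an all-True mask recovers full Transformer attention, which conveys the sense in which architecture choices can be re-expressed as attention mask constructions.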
Supplementary Material: zip
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 22861