Alignment of MPNNs and Graph Transformers

Published: 17 Jun 2024, Last Modified: 21 Aug 2024
Venue: ICML 2024 Workshop GRaM
License: CC BY 4.0
Track: Proceedings
Keywords: Graph Neural Networks, Transformers
TL;DR: We show that MPNNs and Graph Transformers can be aligned in the context of algorithmic reasoning.
Abstract: As the complexity of machine learning (ML) model architectures increases, it is important to understand to what degree simpler and more efficient architectures can align with their complex counterparts. In this paper, we investigate the degree to which a Message Passing Neural Network (MPNN) can operate similarly to a Graph Transformer. We do this by training an MPNN to align with the intermediate embeddings of a Relational Transformer (RT). Throughout this process, we explore variations of the standard MPNN and assess the impact of different components on the degree of alignment. Our findings suggest that an MPNN can align with RT, and that the components with the greatest impact on alignment are the MPNN's permutation-invariant aggregation function, virtual node, and layer normalisation.
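To make the alignment setup concrete, below is a minimal PyTorch sketch (not the authors' code) of the kind of training described in the abstract: an MPNN with permutation-invariant sum aggregation, a virtual node, and layer normalisation, trained so that each layer's node embeddings match precomputed intermediate embeddings of an RT. The names (`rt_embeddings`, `alignment_step`) and the choice of MSE as the alignment loss are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: training an MPNN so that its per-layer node embeddings
# match precomputed intermediate embeddings of a Relational Transformer (RT).
# The MSE alignment loss and all names here are illustrative assumptions.
import torch
import torch.nn as nn

class MPNNLayer(nn.Module):
    """One message-passing layer with permutation-invariant sum aggregation,
    an optional virtual node, and layer normalisation."""
    def __init__(self, dim: int, use_virtual_node: bool = True):
        super().__init__()
        self.message = nn.Linear(2 * dim, dim)
        self.update = nn.Linear(2 * dim, dim)
        self.norm = nn.LayerNorm(dim)
        self.use_virtual_node = use_virtual_node
        if use_virtual_node:
            self.vn_update = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        n = h.size(0)
        # Build all pairwise messages, mask by adjacency, and sum into each
        # receiver (sum is permutation invariant over neighbourhoods).
        src = h.unsqueeze(1).expand(n, n, -1)  # sender features
        dst = h.unsqueeze(0).expand(n, n, -1)  # receiver features
        msgs = self.message(torch.cat([src, dst], dim=-1))
        agg = (adj.unsqueeze(-1) * msgs).sum(dim=0)
        h = self.update(torch.cat([h, agg], dim=-1))
        if self.use_virtual_node:
            # Virtual node: a global readout broadcast back to every node,
            # giving the MPNN transformer-like all-to-all communication.
            h = h + self.vn_update(h.mean(dim=0, keepdim=True))
        return self.norm(h)

def alignment_step(layers, opt, x, adj, rt_embeddings):
    """One optimisation step matching each MPNN layer's output to the
    corresponding (frozen, precomputed) RT layer's node embeddings."""
    opt.zero_grad()
    h, loss = x, 0.0
    for layer, target in zip(layers, rt_embeddings):
        h = layer(h, adj)
        loss = loss + nn.functional.mse_loss(h, target)
    loss.backward()
    opt.step()
    return float(loss)

# Toy usage with random tensors standing in for a real graph and RT outputs.
n, dim, depth = 6, 32, 3
layers = nn.ModuleList([MPNNLayer(dim) for _ in range(depth)])
opt = torch.optim.Adam(layers.parameters(), lr=1e-3)
x = torch.randn(n, dim)
adj = (torch.rand(n, n) < 0.4).float()
rt_embeddings = [torch.randn(n, dim) for _ in range(depth)]  # stand-in for RT
print(alignment_step(layers, opt, x, adj, rt_embeddings))
```

The virtual node and layer normalisation enter as toggleable components, which mirrors how the abstract frames them: pieces whose presence or absence can be ablated to measure their effect on the degree of alignment.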
Submission Number: 5