Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?

Published: 10 Oct 2024, Last Modified: 09 Nov 2024
Venue: SciForDL Poster
License: CC BY 4.0
TL;DR: We study the impact of expressivity on the predictive performance of graph neural networks by performing knowledge distillation from highly expressive teacher GNNs to less expressive student GNNs.
Abstract: Motivated by the large body of research on the expressivity of graph neural networks (GNNs), we study the impact of expressivity on their predictive performance. By performing knowledge distillation from highly expressive teacher GNNs to less expressive student GNNs, we demonstrate that knowledge distillation significantly reduces the predictive performance gap between teachers and students. Since knowledge distillation does not increase the expressivity of the student GNN, it follows that most of this gap in predictive performance cannot be due to expressivity.
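For illustration, below is a minimal sketch of the logit-based knowledge-distillation setup the abstract describes, written in PyTorch. The temperature, loss weighting, and the idea of combining a soft-target KL term with standard cross-entropy are assumptions based on common distillation practice (Hinton et al., 2015), not the paper's actual configuration; the teacher and student GNN architectures are likewise placeholders.

```python
# Minimal knowledge-distillation sketch, assuming PyTorch.
# Hyperparameters (temperature, alpha) and the loss form are illustrative,
# not taken from the paper.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Soft-target KL term (teacher -> student) combined with the usual
    cross-entropy on the ground-truth labels."""
    soft_targets = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_preds = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    kd_term = F.kl_div(soft_preds, soft_targets, log_target=True,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1 - alpha) * ce_term


# Usage sketch: the (more expressive) teacher GNN is frozen, and only the
# less expressive student receives gradients.
#
# with torch.no_grad():
#     teacher_logits = teacher_gnn(graph_batch)
# loss = distillation_loss(student_gnn(graph_batch), teacher_logits, labels)
# loss.backward()
```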
Style Files: I have used the style files.
Debunking Challenge: This submission is an entry to the debunking challenge.
Submission Number: 33