CrysGNN: Distilling pre-trained knowledge to enhance property prediction for crystalline materials

Published: 17 Mar 2023, Last Modified: 08 Sept 2024 · ml4materials-iclr2023 Poster
Keywords: Graph Neural Networks, GNN Pretraining, Knowledge Distillation, Crystalline Materials
TL;DR: This paper presents a new pre-trained GNN framework for crystalline materials, which enhances the property prediction accuracy of different SOTA property predictors by injecting distilled pre-trained knowledge.
Abstract: In recent years, graph neural network (GNN) based approaches have emerged as a powerful technique to encode the complex topological structure of crystalline materials in an enriched representation space. These models are typically supervised: using property-specific training data, they learn the relationship between crystal structure and properties such as formation energy, bandgap, and bulk modulus. Most of these methods require a huge amount of property-tagged data to train, which may not be available for every property. However, a huge amount of crystal data with chemical composition and structural bonds is readily available. To leverage these untapped data, this paper presents CrysGNN, a new pre-trained GNN framework for crystalline materials, which captures both node- and graph-level structural information of crystal graphs using a large amount of unlabelled material data. Further, we extract distilled knowledge from CrysGNN and inject it into different state-of-the-art property predictors to enhance their property prediction accuracy. We conduct extensive experiments to show that, with distilled knowledge from the pre-trained model, all the SOTA algorithms outperform their own vanilla versions by good margins. We also observe that the distillation process provides a significant improvement over the conventional approach of finetuning the pre-trained model.
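To make the distillation idea concrete, below is a minimal sketch (plain PyTorch, toy data) of training a property predictor against both a property label and the frozen node embeddings of a pre-trained encoder. The module names (`PretrainedEncoder`, `StudentPredictor`), the feature-matching MSE term, and the weight `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of feature-level knowledge distillation: a frozen pre-trained encoder
# (stand-in for CrysGNN) supplies node embeddings, and a student property
# predictor is trained to match them while fitting the property label.
import torch
import torch.nn as nn

class PretrainedEncoder(nn.Module):
    """Stand-in for a pre-trained crystal-graph encoder (weights frozen)."""
    def __init__(self, in_dim=16, hid=64):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid)

    def forward(self, x, adj):
        # One round of mean-aggregation message passing over the crystal graph.
        h = torch.relu(self.lin(x))
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return adj @ h / deg  # node-level embeddings

class StudentPredictor(nn.Module):
    """Property predictor whose hidden node features receive distillation."""
    def __init__(self, in_dim=16, hid=64):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid)
        self.head = nn.Linear(hid, 1)

    def forward(self, x, adj):
        h = torch.relu(self.lin(x))
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h = adj @ h / deg
        y = self.head(h.mean(dim=0))  # graph-level readout -> scalar property
        return y, h

teacher, student = PretrainedEncoder(), StudentPredictor()
teacher.requires_grad_(False)  # distill from frozen pre-trained weights

x = torch.randn(8, 16)                  # toy node features (8 atoms)
adj = (torch.rand(8, 8) > 0.5).float()  # toy bond/adjacency matrix
y_true = torch.tensor([0.42])           # toy property label

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
alpha = 0.5  # assumed weight balancing property loss vs. distillation loss
for _ in range(100):
    y_pred, h_student = student(x, adj)
    with torch.no_grad():
        h_teacher = teacher(x, adj)
    loss = nn.functional.mse_loss(y_pred, y_true) \
         + alpha * nn.functional.mse_loss(h_student, h_teacher)
    opt.zero_grad(); loss.backward(); opt.step()
```

Note the contrast with finetuning: the student keeps its own architecture and training loop, and the pre-trained model only contributes an auxiliary feature-matching target.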
Community Implementations: 5 code implementations (https://www.catalyzex.com/paper/crysgnn-distilling-pre-trained-knowledge-to/code)