Hyperparameter Optimization of Graph Neural Networks for the OpenCatalyst Dataset: A Case Study

Published: 22 Nov 2022, Last Modified: 05 May 2023
Venue: AI4Mat 2022 Poster
Keywords: Hyperparameter Optimization, Graph Neural Networks, OpenCatalyst Dataset, Multi-Fidelity Optimization
TL;DR: We build a framework to find performant hyperparameters on lower fidelities of the OpenCatalyst dataset and use them to run large-scale training experiments.
Abstract: The proliferation of deep learning (DL) techniques in recent years has often resulted in the creation of progressively larger datasets and deep learning architectures. As the expressive power of DL models has grown, so has the compute capacity needed to train them effectively. One such example is the OpenCatalyst dataset in the emerging field of scientific machine learning, which has elevated the compute requirements needed to effectively train graph neural networks (GNNs) on complex scientific data. The substantial computational cost of training GNNs on the OpenCatalyst dataset makes it very costly to perform hyperparameter optimization (HPO) using traditional methods, such as grid search or even Bayesian optimization-based approaches. Given this challenge, we propose a novel methodology for effective, cost-aware HPO for GNN training on OpenCatalyst that leverages a multi-fidelity approach with experiments on reduced datasets, hyperparameter importance, and computational budget considerations. We demonstrate speedups of over 50 percent when performing hyperparameter optimization of the E(n)-GNN model on the OpenCatalyst dataset.
Paper Track: Behind the Scenes
Submission Category: AI-Guided Design
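
The abstract above describes a multi-fidelity approach in which cheap experiments on reduced datasets filter the hyperparameter space before expensive full-scale training. Below is a minimal illustrative sketch of one common realization of that idea, a successive-halving loop where the fidelity level is the fraction of the training set. The search space, fidelity levels, and all function names (`sample_config`, `train_and_eval`, `successive_halving`) are assumptions for demonstration only; the paper's actual algorithm, search space, and budgets may differ.

```python
import random

# Hypothetical search space for an E(n)-GNN; values are illustrative,
# not taken from the paper.
SEARCH_SPACE = {
    "lr": [1e-4, 3e-4, 1e-3, 3e-3],
    "hidden_dim": [128, 256, 512],
    "num_layers": [3, 4, 5, 6],
}

# Fidelity levels: fractions of the full OpenCatalyst training split.
FIDELITIES = [0.01, 0.05, 0.25, 1.0]


def sample_config(rng):
    """Draw one random configuration from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}


def train_and_eval(config, data_fraction):
    """Stand-in for training the GNN on a data subset; returns validation error.

    A real implementation would subsample the dataset, train under a fixed
    step budget, and report validation energy/force error.
    """
    rng = random.Random(repr(sorted(config.items())) + repr(data_fraction))
    return rng.uniform(0.1, 1.0) * (1.0 - 0.2 * data_fraction)  # dummy score


def successive_halving(n_initial=16, keep_ratio=0.5, seed=0):
    """Evaluate configs cheaply first; promote only the best to higher fidelities."""
    rng = random.Random(seed)
    configs = [sample_config(rng) for _ in range(n_initial)]
    for fraction in FIDELITIES:
        scored = sorted(
            ((train_and_eval(c, fraction), c) for c in configs),
            key=lambda pair: pair[0],  # lower validation error is better
        )
        keep = max(1, int(len(scored) * keep_ratio))
        configs = [c for _, c in scored[:keep]]
        print(f"fidelity={fraction:.2f}: kept {len(configs)} configs")
    return configs[0]


if __name__ == "__main__":
    best = successive_halving()
    print("best config:", best)
```

The design intuition is that most of the search budget is spent at low fidelities, where each trial is cheap, so only a handful of surviving configurations ever reach full-dataset training; this is the source of the cost savings the abstract reports.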