Scalable Geometric Deep Learning on Molecular Graphs

Published: 22 Oct 2021, Last Modified: 14 Jul 2024
NeurIPS-AI4Science Poster
Keywords: geometric deep learning, scaling, molecular machine learning, applied machine learning, scientific computing
TL;DR: We train four geometric deep learning architectures on molecular graphs across hundreds of GPUs and investigate the scaling behavior.
Abstract: Deep learning in molecular and materials sciences is limited by the lack of integration between applied science, artificial intelligence, and high-performance computing. Bottlenecks with respect to the amount of training data, the size and complexity of model architectures, and the scale of the compute infrastructure are all key factors limiting the scaling of deep learning for molecules and materials. Here, we present LitMatter, a lightweight framework for scaling molecular deep learning methods. We train four graph neural network architectures on over 400 GPUs and investigate the scaling behavior of these methods. Depending on the model architecture, we observe training-time speedups of up to 60x. Empirical neural scaling relations quantify the model-dependent scaling, enabling optimal compute resource allocation and the identification of scalable molecular geometric deep learning model implementations.
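The abstract does not give the form of the empirical neural scaling relations; a common choice for training-time scaling is a power law, t(n) = a * n^(-b), fit in log-log space to measured (GPU count, training time) pairs. The sketch below is an illustration of that fitting procedure under this power-law assumption, using made-up timing numbers rather than measurements from the paper.

```python
import math

def fit_power_law(n_gpus, times):
    """Least-squares fit of t = a * n**(-b) in log-log space.

    log t = log a - b * log n, so a linear regression on
    (log n, log t) recovers the scaling exponent b.
    """
    xs = [math.log(n) for n in n_gpus]
    ys = [math.log(t) for t in times]
    k = len(xs)
    mx = sum(xs) / k
    my = sum(ys) / k
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), -slope  # (a, b)

# Synthetic timings following t = 100 * n**(-0.8); illustrative only,
# not data from the paper.
gpus = [1, 2, 4, 8, 16, 32]
times = [100.0 * n ** -0.8 for n in gpus]
a, b = fit_power_law(gpus, times)
speedup_32 = times[0] / times[-1]  # observed speedup at 32 GPUs
```

An exponent b near 1 indicates near-linear scaling, while a smaller b signals diminishing returns; comparing fitted exponents across architectures is one way to decide where additional GPUs are best allocated.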
Track: Original Research Track