Unveiling the Secrets of $^1$H-NMR Spectroscopy: A Novel Approach Utilizing Attention Mechanisms

Published: 27 Oct 2023, Last Modified: 11 Dec 2023, AI4Mat-2023 Poster
Submission Track: Papers
Submission Category: Automated Material Characterization
Keywords: NMR analysis, attention mapping, BERT
TL;DR: Attention layers in BERT models can reveal the connection between peaks in NMR spectra and atoms in the molecule
Abstract: The significance of Nuclear Magnetic Resonance (NMR) spectroscopy in organic synthesis cannot be overstated, as it plays a pivotal role in deducing chemical structures from experimental data. While machine learning has predominantly been employed for predictive purposes in the analysis of spectral data, our study introduces a novel application of a transformer-based model's attention weights to unravel the underlying "language" that correlates spectral peaks with their corresponding atoms in the chemical structure. This attention-mapping technique proves beneficial for comprehending spectra, enabling accurate assignment of spectra to the respective molecules. Our approach correctly assigns experimental $^1$H-NMR spectra to the respective molecules in a reaction with an accuracy exceeding 95\%. Furthermore, it consistently associates peaks with the correct atoms in the molecule, achieving a peak-to-atom match rate of 71\% for exact matches and 89\% for close shift matches ($\pm$0.59 ppm). This framework exemplifies the capability of harnessing the attention mechanism within transformer models to unveil the intricacies of spectroscopic data. Importantly, the approach can readily be extended to other types of spectra, showcasing its versatility and potential for broader applications in the field.
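To illustrate the general idea behind this kind of attention mapping, the sketch below is a minimal, untrained toy example (not the authors' code): the attention module, dimensions, and random token embeddings are all hypothetical. It shows how attention weights between peak tokens and atom tokens can be read out and turned into a peak-to-atom assignment by picking, for each peak, the atom it attends to most strongly.

```python
# Minimal sketch of attention-based peak-to-atom assignment.
# All names, dimensions, and embeddings are hypothetical placeholders.
import torch

torch.manual_seed(0)

n_peaks, n_atoms, d_model, n_heads = 5, 8, 32, 4

# Stand-ins for encoded spectral-peak tokens (queries) and atom tokens
# (keys/values), e.g. from a transformer encoder over spectrum and SMILES.
peak_tokens = torch.randn(1, n_peaks, d_model)
atom_tokens = torch.randn(1, n_atoms, d_model)

attn = torch.nn.MultiheadAttention(d_model, n_heads, batch_first=True)
_, attn_weights = attn(peak_tokens, atom_tokens, atom_tokens,
                       need_weights=True, average_attn_weights=True)
# attn_weights has shape (batch, n_peaks, n_atoms): one row per peak.

# Assign each peak to the atom receiving the highest attention weight.
assignment = attn_weights.argmax(dim=-1).squeeze(0)
for peak_idx, atom_idx in enumerate(assignment.tolist()):
    print(f"peak {peak_idx} -> atom {atom_idx}")
```

In a trained model the same read-out would be applied to the learned attention layers, so the argmax over a peak's attention row yields its predicted atom assignment.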
Digital Discovery Special Issue: Yes
Submission Number: 56