Text-To-Energy: Accelerating Quantum Chemistry Calculations through Enhanced Text-to-Vector Encoding and Orbital-Aware Multilayer Perceptron

21 Sept 2023 (modified: 29 Jan 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: quantum mechanics, quantum chemistry, computational physics, density functional theory, material science, physics-informed machine learning
TL;DR: We introduce a text-to-vector encoder and a novel MLP to rapidly and accurately predict material properties. Our work enables easier access to quantum chemistry at low computational cost and facilitates further understanding of materials.
Abstract: Accurately predicting material properties remains a complex and computationally intensive task. In this work, we introduce Text-To-Energy (T2E), a novel approach combining text-to-vector encoding and a multilayer perceptron (MLP) for rapid and precise energy predictions. T2E begins by converting pivotal material attributes to a vector representation, followed by an MLP block that incorporates significant physical data. This integration of textual, physical, and quantum insights enables T2E to swiftly and accurately predict the total energy of material systems. The proposed methodology marks a significant departure from conventional computational techniques: it reduces the computational burden imposed by particle count and inter-particle interactions, and it obviates the need for extensive quantum chemistry expertise. Comprehensive validation across a diverse range of atoms and molecules affirms the superior performance of T2E over state-of-the-art solutions such as DFT, FermiNet, and PsiFormer.
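The abstract describes a two-stage pipeline: encode a textual material description into a fixed-size vector, then pass it through an MLP that outputs a total-energy prediction. The paper's actual encoder and network details are not given here, so the following is only a minimal illustrative sketch of that pipeline shape, with a toy character-n-gram hashing encoder and an untrained numpy MLP standing in for the real components.

```python
import numpy as np

def text_to_vector(text, dim=16):
    """Toy text-to-vector encoder: hash character trigrams into a fixed-size
    normalized vector. A stand-in for T2E's encoder, whose details the
    abstract does not specify."""
    vec = np.zeros(dim)
    for i in range(len(text) - 2):
        vec[hash(text[i:i + 3]) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

class TinyMLP:
    """Minimal MLP mapping an encoded description to a scalar
    (total-energy surrogate). Weights here are random; the real model
    would be trained on quantum chemistry data."""
    def __init__(self, dim=16, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, x):
        h = np.maximum(0.0, x @ self.w1 + self.b1)  # ReLU hidden layer
        return float((h @ self.w2 + self.b2)[0])    # scalar energy

# Hypothetical material description; the attribute format is an assumption.
desc = "H2O: 2 hydrogen, 1 oxygen, singlet ground state"
energy = TinyMLP().forward(text_to_vector(desc))
print(energy)
```

The appeal of this architecture, per the abstract, is that inference cost depends only on the encoder and MLP sizes, not on particle count, unlike DFT or neural wavefunction methods such as FermiNet and PsiFormer.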
Supplementary Material: zip
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3410