Parameter-Free Molecular Classification and Regression with Gzip

Submitted to ICLR 2024 on 23 Sept 2023 (modified: 11 Feb 2024)
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: cheminformatics, chemistry, language, compression
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: In recent years, natural language processing approaches to machine learning, most prominently deep neural network-based transformers, have been extensively applied to molecular classification and regression tasks, including the prediction of pharmacokinetic and quantum-chemical properties. However, models based on deep neural networks generally require extensive training, large training data sets, and resource-consuming hyperparameter tuning. Recently, a low-resource and universal alternative to deep learning approaches based on Gzip compression for text classification has been proposed, which reportedly performs surprisingly well compared to large language models such as BERT, given its conceptually simple nature. Here, we adapt the proposed method to support multiprocessing, multi-class classification, class weighting, and regression, and apply it to classification and regression tasks on various data sets of molecules from the organic chemistry, biochemistry, drug discovery, and materials science domains. We further propose converting numerical descriptors into string representations, enabling the integration of language input with domain-informed descriptors. Our results show that the method can be used to classify and predict a variety of properties of molecules, can reach the performance of large-scale chemical language models in a subset of tasks, and has the potential for application in information retrieval from large chemical databases.
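The compression-based approach the abstract builds on can be sketched as a k-nearest-neighbor classifier over the normalized compression distance (NCD) between strings, here applied to SMILES. This is a minimal illustrative sketch of the general technique, not the submission's implementation; the function names and the toy SMILES/label data are invented for the example:

```python
import gzip
from collections import Counter


def ncd(x: str, y: str) -> float:
    """Normalized compression distance between two strings via gzip:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + " " + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)


def knn_classify(query: str, train: list, k: int = 3) -> str:
    """Predict the class of `query` by majority vote among the k
    training examples with the smallest NCD to it."""
    nearest = sorted(train, key=lambda item: ncd(query, item[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]


# Toy illustration with made-up SMILES and labels (not the paper's data).
train = [
    ("CCO", "alcohol"), ("CCCO", "alcohol"), ("CC(C)O", "alcohol"),
    ("CC(=O)O", "acid"), ("CCC(=O)O", "acid"), ("CC(C)C(=O)O", "acid"),
]
print(knn_classify("CCCCO", train, k=3))
```

Regression can be handled analogously by averaging (or distance-weighting) the target values of the k nearest neighbors instead of taking a majority vote, which is one of the extensions the abstract describes.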
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7626