Keywords: Sparse Autoencoders, Chemistry Language Models, Mechanistic Interpretability, Molecular Representations, Computational Chemistry, Feature Steering, Foundation Models, Latent Space Analysis
TL;DR: Sparse autoencoders trained on chemistry language models extract interpretable features corresponding to chemical substructures, physicochemical properties, and pharmacological functions, and enable causal feature steering.
Abstract: Since the advent of machine learning, interpretability has remained a persistent challenge, becoming increasingly urgent as generative models support high-stakes applications in drug and materials discovery. Recent advances in large language model (LLM) architectures have yielded chemistry language models (CLMs) with impressive capabilities in molecular property prediction and generation. However, how these models internally represent chemical knowledge remains poorly understood. In this work, we extend sparse autoencoder techniques to uncover and examine interpretable features within CLMs. Applying our methodology to the Foundation Models for Materials (FM4M) SMI-TED chemistry foundation model, we extract semantically meaningful latent features and analyse their activation patterns across diverse molecular datasets. Our findings reveal that these models encode a rich landscape of chemical concepts. We identify correlations between specific latent features and distinct domains of chemical knowledge, including structural motifs, physicochemical properties, and pharmacological drug classes. Our approach provides a generalisable framework for uncovering latent knowledge in chemistry-focused AI systems. This work has implications for both foundational understanding and practical deployment, with the potential to accelerate computational chemistry research.
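Example (illustrative only): a minimal PyTorch sketch of the standard sparse-autoencoder setup the abstract describes, assuming an L1-penalised overcomplete autoencoder over CLM activations. The dimensions, sparsity coefficient, feature index, and SMI-TED hookup are placeholder assumptions, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class SparseAutoencoder(nn.Module):
        """Overcomplete autoencoder with an L1 sparsity penalty on its latents.

        d_model and n_latents are illustrative; in practice d_model would match
        the CLM's hidden size and n_latents a much larger feature dictionary.
        """

        def __init__(self, d_model: int = 768, n_latents: int = 8192):
            super().__init__()
            self.encoder = nn.Linear(d_model, n_latents)
            self.decoder = nn.Linear(n_latents, d_model)

        def forward(self, x: torch.Tensor):
            z = torch.relu(self.encoder(x))  # sparse latent features
            return self.decoder(z), z        # reconstruction, latents

    def sae_loss(x, x_hat, z, l1_coeff: float = 1e-3):
        # Reconstruction error plus an L1 term encouraging few active features.
        return (x - x_hat).pow(2).mean() + l1_coeff * z.abs().mean()

    # Stand-in activations; in practice these would be hidden states collected
    # from the chemistry language model (e.g. SMI-TED) over a molecular corpus.
    acts = torch.randn(32, 768)
    sae = SparseAutoencoder()
    x_hat, z = sae(acts)
    sae_loss(acts, x_hat, z).backward()

    # Hypothetical feature steering: amplify one latent direction and decode
    # back into model space (feature index and magnitude are illustrative).
    z_steered = z.detach().clone()
    z_steered[:, 123] += 5.0
    steered_acts = sae.decoder(z_steered)

In this setup, individual latent dimensions can then be probed for correlations with substructures or properties, and amplifying a latent before decoding gives a simple route to the causal steering the TL;DR mentions.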
Submission Track: Findings, Tools & Open Challenges
Submission Category: AI-Guided Design + Automated Material Characterization
Institution Location: Cape Town, South Africa; Oxford, United Kingdom
Submission Number: 104