Solvaformer: Minimizing Geometric Redundancy for Scalable Solubility Prediction

Published: 02 Mar 2026, Last Modified: 11 Mar 2026
Venue: ICLR 2026 Workshop GRaM Poster
License: CC BY 4.0
Track: tiny paper (up to 4 pages)
Keywords: SE(3)-Equivariance, Molecular Solubility, Scale and Simplicity
TL;DR: Solvaformer achieves scalable and accurate solubility prediction by eliminating geometric redundancy, using a hybrid architecture that combines strict intramolecular SE(3)-equivariance with simplified scalar intermolecular attention.
Abstract: Accurate prediction of small molecule solubility requires balancing physical fidelity with computational scalability. While geometric deep learning offers superior inductive biases, applying full SE(3)-equivariance to dynamic multi-component systems introduces geometric redundancy and high computational cost. We introduce Solvaformer, a graph transformer designed to achieve simplicity at scale by selectively grounding interactions in geometry. The architecture challenges the need for global equivariance: it applies strict SE(3)-equivariant attention only to rigid intramolecular structures, while modeling fluid intermolecular interactions via computationally efficient scalar attention. While Solvaformer demonstrates strong performance (approaching the DFT baseline), we also report a significant negative result: a simpler MPNN augmented with partial charges derived from machine-learned interatomic potentials (MLIPs) slightly outperforms the explicit Solvaformer architecture. This suggests that for scalar solubility prediction, high-quality electronic descriptors may render end-to-end equivariant architectures redundant. Our findings highlight that geometric redundancy can be minimized either architecturally (Solvaformer) or via decoupled feature generation (MPNN w/ MLIPs), offering two scalable paths for solution-phase modeling.
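The abstract's central architectural idea — geometry-aware attention within a molecule, plain scalar attention between molecules — can be sketched in a toy form. This is an illustrative sketch only, not the paper's implementation: the function name, the Gaussian distance bias (used here as a simplified rotation-invariant stand-in for the paper's SE(3)-equivariant intramolecular attention), and all tensor shapes are assumptions.

```python
import numpy as np

def hybrid_attention(x, coords, mol_id):
    """Toy hybrid attention over n atoms with d scalar features each.

    Intramolecular pairs (same mol_id) receive a rotation-invariant
    Gaussian distance bias; intermolecular pairs use plain scalar
    dot-product attention. Names and choices here are illustrative.
    """
    n, d = x.shape
    # Scalar dot-product logits, computed for every atom pair.
    logits = x @ x.T / np.sqrt(d)
    # Pairwise distances are invariant to global rotation/translation,
    # so the biased attention weights are too.
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    same_mol = mol_id[:, None] == mol_id[None, :]
    # Geometric bias applied only inside each molecule.
    logits = logits + np.where(same_mol, np.exp(-dist**2), 0.0)
    # Row-wise softmax, then a weighted sum of neighbor features.
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ x
```

Because the geometric term enters only through interatomic distances, the output scalars are unchanged when all coordinates are globally rotated — the invariance that makes geometry "strict" intramolecularly while intermolecular attention stays purely scalar and cheap.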
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 35