Abstract: State-of-the-art models for 3D molecular generation rely on strong inductive biases: SE(3) equivariance, permutation invariance, and graph message-passing networks that capture local chemistry. Yet the molecules they generate still struggle with physical plausibility.
We introduce TABASCO, which relaxes these assumptions: the model uses a standard non-equivariant transformer architecture, treats the atoms of a molecule as a sequence, and does not explicitly model bonds. Dropping equivariant layers and message passing lets us simplify the model architecture and scale data throughput.
On the GEOM-Drugs and QM9 benchmarks, TABASCO achieves state-of-the-art PoseBusters validity and runs inference roughly 10x faster than the strongest baseline, while exhibiting emergent rotational equivariance without hard-coded symmetry.
Our work offers a blueprint for training minimalist, high-throughput, unconditional generative models, and the resulting architecture is readily extensible to future conditional tasks.
We provide a link to our implementation at https://github.com/carlosinator/tabasco.
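As a rough illustration of the abstract's core idea, the sketch below passes a molecule, encoded as a plain token sequence of per-atom features (3D coordinates plus an element one-hot), through an ordinary scaled dot-product self-attention layer with no equivariant layers, no message passing, and no bond graph. All names, dimensions, and the toy data are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_layer(tokens, Wq, Wk, Wv):
    # Plain (non-equivariant) scaled dot-product self-attention:
    # atoms attend to each other as a generic token sequence.
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

# Hypothetical toy molecule: 5 atoms, each token = 3 coords + 4-dim element one-hot.
rng = np.random.default_rng(0)
n_atoms, d = 5, 7
tokens = rng.normal(size=(n_atoms, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = attention_layer(tokens, Wq, Wk, Wv)
print(out.shape)  # (5, 7): one updated feature vector per atom
```

Because the coordinates enter as ordinary features rather than through symmetry-constrained layers, any rotational equivariance the trained model exhibits must be learned from data, which is the behavior the abstract reports as emergent.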
Submission Type: Regular submission (no more than 12 pages of main content)
Code: https://github.com/carlosinator/tabasco
Supplementary Material: zip
Assigned Action Editor: ~Miguel_Ángel_Bautista1
Submission Number: 6521