Hierarchical Mixture of Topological Experts for Molecular Property Prediction

Published: 06 Mar 2025, Last Modified: 21 Jul 2025, ICLR 2025 Workshop LMRL, CC BY 4.0
Track: Full Paper Track
Keywords: Mixture-of-Experts, MoE, Molecular Property Prediction, Graph Neural Networks, GNN, Topology-Aware, Multi-Scale Representations
TL;DR: A hierarchical Mixture-of-Experts (MoE) model that predicts molecular properties by combining atom-level, substructure-level, and whole-molecule representations for improved accuracy.
Abstract: Molecular property prediction enables rapid identification of promising drug candidates by forecasting key attributes such as bioactivity and toxicity. The relationship between molecular structure and properties spans multiple scales—from individual atoms to functional groups to the overall molecular framework. Depending on the property task and the target molecule’s scaffold, prediction may require focusing on specific substructures or the entire molecular configuration. This observation suggests that selectively attending to relevant structural features at different scales can improve prediction accuracy. In this light, we propose HierMolMoE, a hierarchical mixture-of-experts framework that learns specialized predictive models at three natural granularities of molecular graphs: atom-level, motif-level, and global-level. Our model integrates expert networks at each level with a high-level gating mechanism, and each expert is tailored to capture the unique topological semantics of molecular groups sharing similar scaffolds. Experiments on benchmark datasets demonstrate that HierMolMoE outperforms existing GNN-based mixture-of-experts approaches for molecular property prediction, highlighting its ability to learn robust structure–property relationships across scales.
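To make the gated three-level design concrete, below is a minimal PyTorch sketch of the idea described in the abstract. It assumes pooled atom-, motif-, and global-level molecule representations are already produced by upstream encoders; the MLP experts, the class names HierMoESketch and LevelExpert, and all dimensions are illustrative stand-ins (the paper's experts are GNN-based), not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LevelExpert(nn.Module):
    """Placeholder expert: a small MLP standing in for a GNN-based expert."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class HierMoESketch(nn.Module):
    """One expert per granularity (atom / motif / global), combined by a
    learned softmax gate over the concatenated pooled representations."""

    LEVELS = ("atom", "motif", "global")

    def __init__(self, dims: dict, hidden_dim: int = 128, out_dim: int = 1):
        super().__init__()
        self.experts = nn.ModuleDict(
            {lvl: LevelExpert(dims[lvl], hidden_dim, out_dim) for lvl in self.LEVELS}
        )
        # High-level gate: scores the three levels from all pooled features.
        self.gate = nn.Linear(sum(dims[lvl] for lvl in self.LEVELS), len(self.LEVELS))

    def forward(self, feats: dict) -> torch.Tensor:
        # feats[lvl]: (batch, dims[lvl]) pooled molecule representation at that level
        gate_in = torch.cat([feats[lvl] for lvl in self.LEVELS], dim=-1)
        weights = F.softmax(self.gate(gate_in), dim=-1)             # (batch, 3)
        preds = torch.stack(
            [self.experts[lvl](feats[lvl]) for lvl in self.LEVELS], dim=-1
        )                                                           # (batch, out_dim, 3)
        # Gate-weighted mixture of the per-level predictions.
        return (preds * weights.unsqueeze(1)).sum(dim=-1)           # (batch, out_dim)


if __name__ == "__main__":
    dims = {"atom": 64, "motif": 32, "global": 16}  # arbitrary feature sizes
    model = HierMoESketch(dims)
    feats = {lvl: torch.randn(4, d) for lvl, d in dims.items()}
    print(model(feats).shape)  # torch.Size([4, 1])
```

A per-scaffold-group specialization of experts, as the abstract describes, could be layered on top by widening each level to several experts and letting the gate select among them; the sketch keeps one expert per level for brevity.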
Attendance: Kiwoong Yoo
Submission Number: 61