MatBind: Probing the multimodality of materials science with contrastive learning

Published: 03 Mar 2025, Last Modified: 09 Apr 2025, AI4MAT-ICLR-2025 Spotlight, CC BY 4.0
Submission Track: Multi-Modal Data for Materials Design - Tiny Paper
Submission Category: Automated Material Characterization
Keywords: multimodality, materials science encoding, contrastive learning, perovskite, materials lenses
TL;DR: We present MatBind, a model that, for the first time, aligns four materials science modalities: crystal structures, powder X-ray diffractograms, total density of states, and text descriptions.
Abstract: Materials discovery depends critically on integrating information from multiple experimental and computational techniques, yet most tools today analyze these different data types in isolation. Here, we present MatBind, a model based on the ImageBind architecture that creates a unified embedding space across four key materials science modalities: density of states (DOS), crystal structures, text descriptions, and powder X-ray diffraction (pXRD) patterns. Using a hub-and-spoke architecture with crystal structure as the central modality, MatBind achieves cross-modal recall@1 performance of up to 98% between directly aligned modalities and up to 73% for pairs of modalities not explicitly trained together. Our model demonstrates the ability to make semantically meaningful connections across modalities, enabling researchers to query one type of materials data using another. Our analysis shows that combining multiple modalities can improve the model's ability to recognize important structural features like perovskite crystal systems. This approach lays the foundation for more integrated materials research platforms that can accelerate discovery by leveraging the collective knowledge encoded in materials databases.
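To make the hub-and-spoke training concrete, below is a minimal, hypothetical sketch (not the authors' code) of ImageBind-style contrastive alignment: each spoke modality (DOS, pXRD, text) is paired with the crystal-structure hub through a symmetric InfoNCE loss. The encoder architectures, feature dimensions, and embedding size are placeholder assumptions.

```python
# Hub-and-spoke contrastive alignment sketch (assumptions: MLP encoders,
# placeholder input dimensions, 256-d shared embedding space).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPEncoder(nn.Module):
    """Placeholder encoder mapping a raw feature vector to the shared space."""
    def __init__(self, in_dim: int, embed_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(), nn.Linear(512, embed_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Unit-normalized embeddings so dot products are cosine similarities.
        return F.normalize(self.net(x), dim=-1)

def info_nce(hub: torch.Tensor, spoke: torch.Tensor, temperature: float = 0.07):
    """Symmetric InfoNCE over a batch of paired (hub, spoke) embeddings."""
    logits = hub @ spoke.t() / temperature            # (B, B) similarity matrix
    targets = torch.arange(hub.size(0), device=hub.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Hypothetical featurization sizes for each modality.
encoders = {
    "structure": MLPEncoder(in_dim=128),   # hub modality
    "dos": MLPEncoder(in_dim=300),
    "pxrd": MLPEncoder(in_dim=4096),
    "text": MLPEncoder(in_dim=768),
}

def training_step(batch: dict) -> torch.Tensor:
    """One step: align every spoke modality with the crystal-structure hub."""
    hub_emb = encoders["structure"](batch["structure"])
    return sum(info_nce(hub_emb, encoders[m](batch[m]))
               for m in ("dos", "pxrd", "text"))

# Toy usage with random tensors standing in for featurized data.
batch = {"structure": torch.randn(8, 128), "dos": torch.randn(8, 300),
         "pxrd": torch.randn(8, 4096), "text": torch.randn(8, 768)}
print(training_step(batch).item())
```

Because every spoke is aligned only to the hub, alignment between spoke pairs (e.g., pXRD to text) is emergent rather than directly supervised, which is consistent with the lower recall reported for modality pairs not explicitly trained together.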
Submission Number: 28