Track: Tiny Paper Track
Keywords: Multiscale Graph Representations, Cross-Modal Biological Data Integration, Graph Attention Networks, Multimodal Graph Autoencoders, Hierarchical Graph Structures, Biological Systems, Disease Classification, Cell-Type Annotation
TL;DR: We propose a multiscale graph representation learning framework that unifies biological modalities by embedding them into a multiscale latent space, preserving cross-scale interactions and improving interpretability.
Abstract: We present a multiscale representation of biological systems that captures the complexity of cellular and molecular organization while preserving interpretability and generalizing across modalities. Our framework, which combines graph attention networks with multimodal graph autoencoders, learns shared embeddings across biological scales while enforcing cross-modal alignment. We demonstrate the effectiveness of our approach on downstream tasks such as disease classification and cell-type annotation, and show that it outperforms single-modality baselines. Our results also highlight the importance of cross-modal alignment in biological data integration and demonstrate that the approach scales to large datasets.
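The abstract does not detail the architecture; the following is a minimal sketch of the kind of design it describes, assuming one GAT encoder per modality (via PyTorch Geometric's GATConv), inner-product graph decoders for reconstruction, and a simple mean-squared alignment term between paired nodes across modalities. All names and hyperparameters (ModalityEncoder, CrossModalGAE, align_weight) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption): per-modality GAT encoders + graph autoencoders
# with a cross-modal alignment penalty. Illustrative only.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv


class ModalityEncoder(torch.nn.Module):
    """Two-layer GAT mapping one modality's graph into the shared latent space."""

    def __init__(self, in_dim, hidden_dim, latent_dim, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=heads)
        self.gat2 = GATConv(hidden_dim * heads, latent_dim, heads=1)

    def forward(self, x, edge_index):
        h = F.elu(self.gat1(x, edge_index))
        return self.gat2(h, edge_index)


class CrossModalGAE(torch.nn.Module):
    """Multimodal graph autoencoder: one encoder per modality, inner-product decoders."""

    def __init__(self, in_dims, hidden_dim=64, latent_dim=32):
        super().__init__()
        self.encoders = torch.nn.ModuleList(
            [ModalityEncoder(d, hidden_dim, latent_dim) for d in in_dims]
        )

    def forward(self, graphs):
        # graphs: list of (x, edge_index) tuples, one per modality
        return [enc(x, ei) for enc, (x, ei) in zip(self.encoders, graphs)]

    @staticmethod
    def recon_loss(z, edge_index):
        # Inner-product decoder: score observed edges from node embeddings.
        logits = (z[edge_index[0]] * z[edge_index[1]]).sum(dim=-1)
        return F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))

    @staticmethod
    def align_loss(z_a, z_b, pairs):
        # Pull embeddings of cross-modal node pairs (e.g. gene <-> protein) together.
        return F.mse_loss(z_a[pairs[0]], z_b[pairs[1]])


def training_step(model, graphs, cross_pairs, align_weight=0.1):
    """Per-modality reconstruction loss plus a weighted cross-modal alignment term."""
    zs = model(graphs)
    loss = sum(model.recon_loss(z, ei) for z, (_, ei) in zip(zs, graphs))
    loss = loss + align_weight * model.align_loss(zs[0], zs[1], cross_pairs)
    return loss
```

In this sketch the alignment term is what ties the modalities into a shared latent space; without it, each autoencoder would learn an embedding consistent only with its own graph.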
Submission Number: 22