A Contrastive Learning and Graph-based Approach for Missing Modalities in Multimodal Federated Learning
Abstract: Federated Learning has emerged as a decentralized method for training machine learning models on distributed data sources. It preserves privacy by allowing clients to collaboratively learn a shared global model while keeping their data stored locally. However, a significant challenge arises when clients' datasets have missing modalities, where certain features or modalities are unavailable or incomplete, leading to heterogeneous data distributions. Previous studies have addressed this issue, but they fall short of generalizing to diverse, unobserved individuals. This study introduces MIFL, a novel framework for handling modality-missing clients in Multimodal Federated Learning. MIFL optimizes each client's local model on its available modalities while incorporating the modalities it lacks from other clients. These models are then aggregated through a graph-based attentive aggregation method, which maintains generalized characteristics by updating a global model averaged across clients. Our experimental results demonstrate the effectiveness of MIFL across various client configurations with statistical heterogeneity, showcasing its potential for addressing the challenge of missing modalities in Federated Learning.
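The abstract's graph-based attentive aggregation can be illustrated with a minimal sketch: clients are treated as graph nodes, edge scores come from pairwise similarity of their model parameters, attention weights are obtained by a row-wise softmax, and the global model is the client-wise average of the attended parameters. The function name, cosine-similarity edge scores, and flattened-parameter representation are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def graph_attentive_aggregate(client_params):
    """Sketch of graph-based attentive aggregation (illustrative,
    not MIFL's exact formulation).

    client_params: list of 1-D np.ndarray, one flattened parameter
    vector per client. Returns the aggregated global vector.
    """
    P = np.stack(client_params)                  # (n_clients, dim)
    # Cosine similarity between clients serves as graph edge scores.
    unit = P / np.linalg.norm(P, axis=1, keepdims=True)
    S = unit @ unit.T                            # (n_clients, n_clients)
    # Attention weights: numerically stable softmax over each row.
    A = np.exp(S - S.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)
    # Node-level attentive aggregation, then average across clients
    # to form the shared global model.
    per_node = A @ P                             # (n_clients, dim)
    return per_node.mean(axis=0)
```

With identical client parameters this reduces to plain averaging; with dissimilar clients, the softmax down-weights contributions from less similar neighbors before the final average.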