Keywords: Mixture-of-Experts (MoE), Knowledge Graph, Fairness-Aware Learning, Multimodal Representation Learning, Distribution Shift Robustness
TL;DR: KG-MoE couples knowledge graphs with Mixture-of-Experts routing, improving robustness, interpretability, and subgroup fairness across multimodal tasks.
Abstract: Mixture-of-Experts architectures scale model capacity efficiently but remain limited by correlation-driven routing, a lack of explicit knowledge grounding, and subgroup disparities in high-stakes domains. We propose KG-MoE, a knowledge-grounded and fairness-aware MoE framework that integrates structured knowledge graphs into expert specialization and employs adversarial debiasing to reduce subgroup risk. A dynamic gating network routes inputs across modality-specific experts, while retrieved subgraphs constrain reasoning and guide explanation generation. We derive theoretical bounds showing that knowledge grounding reduces excess risk under distribution shift and that fairness regularization improves worst-group generalization. Empirically, KG-MoE achieves state-of-the-art performance across multimodal benchmarks, including dermoscopic, clinical, and histopathology tasks in dermatology, while reducing demographic parity gaps by more than 50% relative to foundation model baselines. Ablation studies confirm that knowledge integration and fairness constraints each contribute to both robustness and equity, and qualitative analysis demonstrates knowledge-grounded explanations aligned with domain reasoning. Our results position KG-MoE as a general paradigm for trustworthy, interpretable, and fair multimodal learning systems.
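To make the routing idea in the abstract concrete, here is a minimal PyTorch sketch of a gate that conditions expert selection on both the input features and a retrieved knowledge-subgraph embedding. All names (`KGGatedMoE`, `kg_emb`, dimensions, the feed-forward experts) are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KGGatedMoE(nn.Module):
    """Dynamic gate routing an input across modality-specific experts,
    with routing logits conditioned on a knowledge-subgraph embedding."""
    def __init__(self, in_dim: int, kg_dim: int, hidden: int, out_dim: int, n_experts: int):
        super().__init__()
        # One feed-forward expert per modality (illustrative; any backbone would do).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, hidden), nn.GELU(), nn.Linear(hidden, out_dim))
            for _ in range(n_experts)
        )
        # The gate sees input features concatenated with the subgraph embedding,
        # so retrieved knowledge can steer which experts are selected.
        self.gate = nn.Linear(in_dim + kg_dim, n_experts)

    def forward(self, x: torch.Tensor, kg_emb: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.gate(torch.cat([x, kg_emb], dim=-1)), dim=-1)  # (B, E)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)           # (B, E, D)
        return (weights.unsqueeze(-1) * expert_out).sum(dim=1)                  # (B, D)
```

The reported ">50% reduction in demographic parity gaps" refers to the standard fairness metric; for reference, a minimal computation for binary predictions and a binary protected attribute (variable names again assumed) is:

```python
import numpy as np

def demographic_parity_gap(y_pred: np.ndarray, group: np.ndarray) -> float:
    """Absolute difference in positive-prediction rates between the two subgroups."""
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())
```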
Supplementary Material: pdf
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 10561