TL;DR: We propose CoE, a collaborative expert framework that effectively fuses heterogeneous information sources through adaptive cooperation and large-margin optimization.
Abstract: Fusing heterogeneous information remains a persistent challenge in modern data analysis. While significant progress has been made, existing approaches often fail to account for the inherent heterogeneity of object patterns across different semantic spaces. To address this limitation, we propose the **Cooperation of Experts (CoE)** framework, which encodes multi-typed information into unified heterogeneous multiplex networks. By transcending modality and connection differences, CoE provides a powerful and flexible model for capturing the intricate structures of real-world complex data. In our framework, dedicated encoders act as domain-specific experts, each specializing in learning distinct relational patterns in specific semantic spaces. To enhance robustness and extract complementary knowledge, these experts collaborate through a novel **large margin** mechanism supported by a tailored optimization strategy. Rigorous theoretical analyses guarantee the framework’s feasibility and stability, while extensive experiments across diverse benchmarks demonstrate its superior performance and broad applicability.
Lay Summary: In today's world, data comes in many forms (images, text, social interactions), and understanding how to combine them is a major challenge. Our research introduces **CoE**, a framework that helps models make better sense of this diverse information. Instead of using one model to learn from all data types, we train multiple experts, each specialized in a different type of information. These experts work together rather than compete, learning from both individual data types and their combinations.
To make the system more reliable, we introduce a way for the experts to adjust their influence depending on how confident they are in their predictions. We also design an optimization method that ensures experts don't just agree—they make accurate and meaningful decisions together. Our experiments show that CoE outperforms existing models on a wide range of tasks and remains robust even when data is noisy or incomplete. This opens up new possibilities for building smarter, more adaptive systems that can handle the messy, complex information of the real world.
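The confidence-based weighting idea described above can be illustrated with a generic sketch of confidence-weighted expert fusion. This is not the paper's actual mechanism; all function names are hypothetical, and confidence is approximated here by each expert's maximum softmax probability, one simple proxy among many.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_experts(expert_logits, temperature=1.0):
    """Confidence-weighted fusion of per-expert class logits.

    expert_logits: array-like of shape (n_experts, n_classes).
    Each expert's influence grows with its confidence, measured
    here as the maximum softmax probability of its own prediction.
    Returns the fused class distribution and the expert weights.
    """
    probs = softmax(np.asarray(expert_logits, dtype=float))  # (E, C)
    confidence = probs.max(axis=1)                           # (E,)
    weights = softmax(confidence / temperature)              # (E,)
    fused = (weights[:, None] * probs).sum(axis=0)           # (C,)
    return fused, weights

# A confident expert (peaked logits) outweighs a near-uniform one.
fused, w = fuse_experts([[4.0, 0.0, 0.0],    # confident expert
                         [0.1, 0.0, 0.0]])   # unsure expert
```

In this toy setup, the first expert's sharper prediction earns it a larger weight, so the fused distribution follows its vote, the qualitative behavior the lay summary describes.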
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Heterogeneous Information; Multiplex Network; Expert Mechanism
Submission Number: 10673