Abstract: The multisource unsupervised domain adaptation (MUDA) scenario poses a significant challenge in intelligent fault diagnosis (IFD), where the goal is to transfer knowledge learned from multiple labeled source domains to an unlabeled target domain. Existing IFD-oriented MUDA approaches frequently fail to recognize the distinct importance of each source domain relative to specific target samples, or lack flexibility in integrating diagnostic insights from multiple sources. In response, a novel MUDA approach is proposed for IFD, termed point-to-set metric-gated mixture of experts (PSMMoEs). This method leverages a mixture-of-experts (MoEs) framework to automatically integrate complementary information from multiple source domains. It develops a deep point-to-set distance (PSD) metric learning technique within the MoE's gating mechanism, effectively fusing domain-specific features by assessing the similarity between each target sample and each source domain. The method ensures balanced training across progressive stages, harmonizing multitask learning with joint training of the MoE framework. Furthermore, a multilayer maximum mean discrepancy (MMD) measurement is employed for domain alignment, aligning features across domains at multiple levels of the network. To assess the efficacy of the proposed method, it is compared with several leading domain adaptation methods on publicly available and laboratory-based rotating machinery fault datasets. The experimental results demonstrate the superior classification and adaptation capabilities of the proposed fault diagnosis method.
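As context for the MMD-based alignment term mentioned above, the following is a minimal NumPy sketch of the standard (biased) squared-MMD estimator with a Gaussian kernel. The function names and the single-bandwidth kernel are illustrative assumptions, not the paper's implementation; the multilayer variant described in the abstract would apply such a discrepancy at several feature layers of the network.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of x and rows of y.
    sq_dists = (
        np.sum(x**2, axis=1)[:, None]
        + np.sum(y**2, axis=1)[None, :]
        - 2.0 * x @ y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2(source, target, sigma=1.0):
    # Biased estimate of squared maximum mean discrepancy between two samples:
    # E[k(s, s')] + E[k(t, t')] - 2 E[k(s, t)].
    k_ss = gaussian_kernel(source, source, sigma).mean()
    k_tt = gaussian_kernel(target, target, sigma).mean()
    k_st = gaussian_kernel(source, target, sigma).mean()
    return k_ss + k_tt - 2.0 * k_st
```

In a domain-adaptation loss, a term like `mmd2(source_features, target_features)` is minimized alongside the classification loss, pulling the source and target feature distributions together.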
External IDs: dblp:journals/tnn/YangZLLLC25