Hypergraph Contrastive Sensor Fusion for Multimodal Fault Diagnosis in Induction Motors

Usman Ali, Ali Zia, Waqas Ali, Umer Ramzan, Abdul Rehman, Muhammad Tayyab Chaudhry, Wei Xiang

Published: 01 Jan 2026 · Last Modified: 26 Jan 2026 · IEEE Sensors Journal · License: CC BY-SA 4.0
Abstract: Reliable induction motor (IM) fault diagnosis is vital for industrial safety and operational continuity, mitigating costly unplanned downtime. Conventional approaches often struggle to capture complex multimodal signal relationships, are constrained to unimodal data or single fault types, and degrade under noise or cross-domain shifts. We propose the Multimodal Hypergraph Contrastive Attention Network (MM-HCAN), a unified framework that integrates multi-head attention with a hypergraph architecture and contrastive learning to jointly model global and localised intra- and inter-modal dependencies beyond Euclidean embedding spaces. MM-HCAN's gains are driven by three innovations: (i) a structured multimodal fusion scheme with dynamically constructed hyperedges encoding higher-order cross-modal relationships; (ii) modality-specific hypergraph Laplacians for fine-grained, context-aware embedding refinement; and (iii) triplet-based contrastive learning performed directly on the hypergraph manifold, yielding more discriminative, topology-respecting representations. The model enables simultaneous diagnosis of bearing, stator, and rotor faults, addressing the engineering need for consolidated diagnostics. Evaluated on three real-world benchmarks, MM-HCAN achieves up to 99.82% accuracy with strong cross-domain generalisation and robustness to noise. An ablation study confirms the contribution of each component. MM-HCAN offers a scalable and robust solution for comprehensive multi-fault diagnosis, supporting predictive maintenance and extended asset longevity in industrial environments. Code is available at https://github.com/EngrUsmaanAli/MM-HCAN.
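To make the abstract's two core ingredients concrete, the sketch below illustrates (a) one step of embedding refinement with the standard normalized hypergraph Laplacian operator and (b) a triplet margin loss on the refined embeddings. This is a minimal NumPy illustration of the generic techniques named in the abstract, not the authors' MM-HCAN implementation; the incidence matrix `H`, weights `w`, and node indices are hypothetical toy data.

```python
import numpy as np

def hypergraph_propagate(X, H, w=None):
    """One smoothing step X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X,
    where H is the |V| x |E| incidence matrix and w weights the hyperedges.
    Nodes sharing hyperedges are pulled toward similar embeddings."""
    n, m = H.shape
    w = np.ones(m) if w is None else w
    Dv = (H * w).sum(axis=1)                      # weighted vertex degrees
    De = H.sum(axis=0)                            # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1e-12)))
    theta = Dv_inv_sqrt @ H @ np.diag(w / np.maximum(De, 1e-12)) @ H.T @ Dv_inv_sqrt
    return theta @ X

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet margin loss: pull anchor-positive together,
    push anchor-negative at least `margin` further apart."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Toy example: 4 sensor-sample nodes, 2 hyperedges (e.g. one per modality group).
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))  # initial 3-d embeddings
Z = hypergraph_propagate(X, H)
# Nodes 0 and 3 share a hyperedge (same-class pair); node 2 does not.
loss = triplet_loss(Z[0], Z[3], Z[2])
```

In MM-HCAN, per the abstract, such Laplacians are modality-specific and the hyperedges are constructed dynamically; the contrastive objective is applied to the hypergraph-refined embeddings rather than the raw features, which is what "contrastive learning on the hypergraph manifold" refers to.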