Meta-Learning With Learnable Margin Triplet Loss for Few-Shot Fault Diagnosis

Published: 2025 · Last Modified: 05 Jan 2026 · IEEE Trans. Instrum. Meas. 2025 · CC BY-SA 4.0
Abstract: The application of deep learning (DL)-based methods to machinery and equipment fault diagnosis (FD) has yielded significant results. However, the success of DL is contingent on large amounts of labeled data, which are often limited because fault samples are scarce and valuable. Even neural networks with advanced structures struggle to diagnose effectively from few samples. To address this issue, we develop a novel meta-learning with learnable margin triplet loss (MLLMTL) method for few-shot FD (FSFD). The method constructs meta-learning (ML) tasks by blending multisource-domain data; meta-tasks spanning different platforms and variable operating conditions better enhance the learning capability of the model. Furthermore, a novel learnable margin triplet loss (LMTL) function is proposed in this article. First, the conventional fixed (logical) margin is replaced with a learnable (physical) margin, and the positive and negative samples are each related to the margin independently rather than to each other. Second, we use the feature distance between any two anchors as an additional loss term, forcing dissimilar anchor features to maintain a large margin. Compared with the conventional triplet loss (TL) function, LMTL can more finely differentiate distinct fault features in intricate scenarios. Results on two different diagnostic tasks demonstrate that both proposed methods outperform several state-of-the-art FSFD methods.
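The abstract does not give the LMTL in closed form, but the two ingredients it names (a margin to which positives and negatives relate independently, plus a pairwise anchor-separation term) can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration: Euclidean distances, a shared scalar `margin` (which would be a trainable parameter in a DL framework), and a hypothetical `sep_margin` for the anchor-separation term.

```python
import numpy as np

def lmtl_loss(anchors, positives, negatives, margin, sep_margin=1.0):
    """Illustrative sketch of a learnable-margin triplet loss (LMTL).

    `margin` stands in for the learnable physical margin; in a real
    framework it would be a trainable parameter updated by gradient
    descent. Two terms, mirroring the abstract's description:
      1. positives are pulled inside `margin` of their anchor, and
         negatives are pushed outside it -- each relation is measured
         against the margin independently, not against the other;
      2. the distance between any two anchors is penalized if it falls
         below `sep_margin`, keeping dissimilar anchor features apart.
    """
    d_pos = np.linalg.norm(anchors - positives, axis=1)
    d_neg = np.linalg.norm(anchors - negatives, axis=1)
    # Term 1: independent hinge penalties against the margin.
    pull = np.maximum(d_pos - margin, 0.0)   # positive too far from anchor
    push = np.maximum(margin - d_neg, 0.0)   # negative too close to anchor
    # Term 2: pairwise anchor separation.
    n = anchors.shape[0]
    sep = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(anchors[i] - anchors[j])
            sep += max(sep_margin - d, 0.0)
    n_pairs = max(n * (n - 1) / 2, 1)
    return pull.mean() + push.mean() + sep / n_pairs
```

With well-separated anchors, positives at their anchors, and negatives beyond the margin, every hinge is inactive and the loss is zero; moving a positive outside the margin (or two anchors within `sep_margin` of each other) makes it positive, which is the behavior the loss terms are meant to encode.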