A CNN-Based Born-Again TSK Fuzzy Classifier Integrating Soft Label Information and Knowledge Distillation

Published: 01 Jan 2023, Last Modified: 17 Apr 2025. IEEE Trans. Fuzzy Syst. 2023. License: CC BY-SA 4.0
Abstract: This article proposes a CNN-based born-again Takagi–Sugeno–Kang (TSK) fuzzy classifier, denoted CNNBaTSK. CNNBaTSK has the following distinctive characteristics: 1) CNNBaTSK provides a new perspective on knowledge distillation with a noniterative learning method (least learning machine with knowledge distillation, LLM-KD) to solve the consequent parameters of the fuzzy rules, where the consequent parameters are trained jointly on the ground-truth label loss, the knowledge distillation loss, and a regularization term; 2) owing to the inherent advantage of fuzzy rules, CNNBaTSK can express the dark knowledge acquired from the CNN in an interpretable manner. Specifically, the dark knowledge (soft label information) is partitioned into five fixed antecedent fuzzy spaces. The centers of the soft label information in the different fuzzy rules are {0, 0.25, 0.5, 0.75, 1}, which admit corresponding linguistic explanations: {very low, low, medium, high, very high}. For the consequent part of each fuzzy rule, the original features are employed to train the consequent parameters, ensuring direct interpretability in the original feature space. Experimental results on benchmark datasets and the CHB-MIT EEG dataset demonstrate that CNNBaTSK can simultaneously improve classification performance and model interpretability.
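The two ideas the abstract describes can be sketched in a few lines: soft labels are assigned to five fixed antecedent fuzzy sets centered at {0, 0.25, 0.5, 0.75, 1}, and the consequent parameters are obtained noniteratively from a least-squares objective combining a ground-truth loss, a distillation loss, and a regularizer. This is a minimal illustration only: the Gaussian membership function, its width, and the exact form of the LLM-KD objective are assumptions, not the paper's verified formulation.

```python
import numpy as np

# Fixed antecedent centers from the abstract: very low ... very high.
CENTERS = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

def firing_strengths(soft_labels, sigma=0.25):
    """Normalized Gaussian membership of each soft label to the five
    fixed centers (a common TSK choice; the kernel and sigma are
    assumptions, not specified in the abstract)."""
    d = soft_labels[:, None] - CENTERS[None, :]
    mu = np.exp(-d**2 / (2.0 * sigma**2))
    return mu / mu.sum(axis=1, keepdims=True)

def solve_consequents(X, y_true, y_soft, lam_kd=1.0, lam_reg=1e-2):
    """Closed-form (noniterative) consequent parameters minimizing
    ||Xw - y_true||^2 + lam_kd * ||Xw - y_soft||^2 + lam_reg * ||w||^2,
    a hypothetical LLM-KD-style objective: ground-truth loss,
    distillation loss, and a ridge regularization term."""
    n_feat = X.shape[1]
    A = (1.0 + lam_kd) * X.T @ X + lam_reg * np.eye(n_feat)
    b = X.T @ (y_true + lam_kd * y_soft)
    return np.linalg.solve(A, b)
```

Because the objective is quadratic in the consequent parameters, the solve is a single linear system rather than an iterative optimization, which is what makes the "least learning machine" formulation noniterative.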