Highlights

- We design an effective data augmentation strategy through knowledge distillation.
- We construct four types of positive and negative samples for contrastive learning.
- We propose a biased selection module to select biased samples.
- MKDCL achieves state-of-the-art performance and is adaptable to other backbones.