Abstract: Continuous Few-Shot Relation Extraction (CFRE) is a continual learning task that closely mirrors real-world scenarios: it requires acquiring new relations from only a limited set of training samples while retaining knowledge of previously learned relations. The main challenges of CFRE are catastrophic forgetting and the trade-off between stability and adaptability. Existing work commonly uses knowledge distillation to alleviate catastrophic forgetting. However, the knowledge distillation methods currently applied to CFRE, whether single-teacher or multi-teacher, do not partition the data by task, which hinders the learning of new relations and limits recognition performance on old relations. To tackle these challenges, we propose a CFRE model based on Task-Oriented Dynamic Knowledge Distillation (TODKD). Specifically, we design a dual-level task-space-separation knowledge distillation method in which different teacher models at different stages guide the training of samples whose relations belong to different task stages, thereby alleviating catastrophic forgetting. To alleviate sample sparsity in few-shot scenarios, we use memory samples as exemplars and design prompt templates that guide LLMs to generate diverse data samples. In addition, we design a dynamic weight loss that effectively balances the model's stability and adaptability, yielding a more uniform distribution of relation features across tasks. Comprehensive experiments on the FewRel and TACRED datasets demonstrate that TODKD outperforms multiple competitive baselines; notably, it achieves a marked improvement over prior state-of-the-art methods on the final task.
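As a rough illustration of the idea summarized above (not the paper's actual implementation), the following is a minimal PyTorch-style sketch of task-partitioned dual-teacher distillation combined with a dynamically weighted loss; all names (old_teacher, new_teacher, lam, etc.) are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only: old-task and new-task samples are distilled against
# different frozen teacher snapshots, and a dynamic weight trades off the
# supervised loss (adaptability) against the distillation terms (stability).
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, T=2.0):
    """Standard KL-based knowledge distillation loss with temperature T."""
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def task_oriented_kd_loss(student, old_teacher, new_teacher,
                          inputs, labels, task_ids, current_task, lam):
    """
    student:      model being trained on the current task
    old_teacher:  frozen snapshot from earlier tasks (guides old-task samples)
    new_teacher:  frozen snapshot adapted to the current task (guides new-task samples)
    task_ids:     task index of each sample in the (memory + current) batch
    lam:          dynamic weight in [0, 1] balancing stability vs. adaptability
    """
    logits = student(inputs)
    is_old = task_ids < current_task  # partition the batch by task stage

    # Adaptability term: supervised cross-entropy over the whole batch.
    ce = F.cross_entropy(logits, labels)

    # Stability term: old-task samples are distilled from the old teacher,
    # current-task samples from the new teacher (dual-level separation).
    with torch.no_grad():
        t_old = old_teacher(inputs)
        t_new = new_teacher(inputs)
    kd = logits.new_zeros(())
    if is_old.any():
        kd = kd + distill_loss(logits[is_old], t_old[is_old])
    if (~is_old).any():
        kd = kd + distill_loss(logits[~is_old], t_new[~is_old])

    # Dynamic weighting: a larger lam emphasizes retaining old relations,
    # a smaller lam emphasizes learning new ones.
    return lam * kd + (1.0 - lam) * ce
```

In such a setup, lam would typically be adjusted over training (e.g., as a function of the task index or validation performance) rather than fixed, which is what allows the trade-off between stability and adaptability to shift dynamically.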