Abstract: Zero-shot relation extraction is an important research topic in information extraction, as it can effectively alleviate the lack of training samples for some relations. Existing zero-shot relation extraction methods based on pre-trained language models (PLMs) typically extract features directly from sentences, but fail to provide satisfactory semantic representations when the task conflicts with the knowledge encoded in the PLMs. To address this issue, a novel dynamic Prompt-driven method is proposed that fully stimulates the knowledge in PLMs to promote relation extraction. Specifically, zero-shot relation extraction is formulated as a masked language model (MLM) task, in which the [MASK] representation serves as the relation representation for classification. Furthermore, the key problems of the Prompt paradigm for zero-shot relation extraction are explored, including the effect of the template and representation degradation. On this basis, the new model employs a dynamic template to provide greater flexibility and introduces contrastive learning to optimize the semantic representation. Extensive experiments on three benchmark datasets (FewRel, TACRED, and Wiki-ZSL) demonstrate that the proposed model achieves state-of-the-art performance while resolving the identified problems.
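The classification step described in the abstract — taking the hidden vector at the [MASK] position of a prompted sentence as the relation representation and scoring it against candidate relations — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the static template, the embedding dimension, the random stand-in vectors, and the cosine-similarity scoring are all assumptions for demonstration (the paper uses dynamic templates and PLM-derived representations).

```python
import numpy as np

def build_prompt(sentence: str, head: str, tail: str) -> str:
    """Hypothetical static template with a [MASK] slot; the paper
    instead uses dynamic templates for greater flexibility."""
    return f"{sentence} {head} [MASK] {tail}."

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def classify(mask_vec: np.ndarray, relation_vecs: dict) -> str:
    """Pick the relation whose embedding is most similar to the
    [MASK]-position representation."""
    return max(relation_vecs, key=lambda r: cosine(mask_vec, relation_vecs[r]))

rng = np.random.default_rng(0)
dim = 8
# Stand-in relation embeddings; a real system would derive these from the PLM.
relations = {r: rng.normal(size=dim) for r in ["founder_of", "born_in", "capital_of"]}
# Stand-in [MASK] representation, perturbed slightly from the "born_in" embedding.
mask_vec = relations["born_in"] + 0.05 * rng.normal(size=dim)

print(build_prompt("Ada Lovelace was born in London", "Ada Lovelace", "London"))
print(classify(mask_vec, relations))  # → born_in
```

In a real pipeline the [MASK] vector would come from the PLM's final hidden layer, and contrastive learning would pull representations of the same relation together to counter the representation degradation noted in the abstract.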