Paper Link: https://openreview.net/forum?id=KkXjo4lo58K
Paper Type: Short paper (up to four pages of content + unlimited references and appendices)
Abstract: Lifelong event detection aims to incrementally update a model with new event types and data while retaining the ability to detect previously learned types. One critical challenge is that the model catastrophically forgets old types when continually trained on new data. In this paper, we introduce \textbf{E}pisodic \textbf{M}emory \textbf{P}rompts (\textbf{EMP}) to explicitly preserve learned task-specific knowledge. Our method adopts a continuous prompt for each task, optimized to instruct the model's predictions and to learn event-specific representations. The EMPs learned in previous tasks are carried along with the model into subsequent tasks, serving as a memory module that retains old knowledge and transfers it to new tasks. Experimental results demonstrate the effectiveness of our method. Furthermore, we conduct a comprehensive analysis of new and old event types in lifelong learning.
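The abstract's core mechanism, accumulating one continuous prompt per task and carrying all of them forward as a memory prefix, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class name `EMPStore`, the plain-list vector representation, and the Gaussian initialization are all assumptions for clarity.

```python
import random

class EMPStore:
    """Illustrative sketch of Episodic Memory Prompts: one continuous
    prompt per task; prompts from earlier tasks are kept and carried
    forward, forming a growing memory prefix for the model input.
    (Hypothetical helper, not from the paper.)"""

    def __init__(self, prompt_len, dim):
        self.prompt_len = prompt_len  # number of prompt vectors per task
        self.dim = dim                # embedding dimension
        self.prompts = []             # one prompt (list of vectors) per task

    def start_task(self):
        # Randomly initialize a new trainable prompt for the incoming task;
        # in a real system these vectors would be optimized by gradient descent.
        new_prompt = [[random.gauss(0.0, 0.02) for _ in range(self.dim)]
                      for _ in range(self.prompt_len)]
        self.prompts.append(new_prompt)
        return new_prompt

    def memory_prefix(self):
        # Concatenate all task prompts, oldest first; this prefix would be
        # prepended to the input so the model can attend to knowledge
        # accumulated from every previous task.
        return [vec for task_prompt in self.prompts for vec in task_prompt]

store = EMPStore(prompt_len=2, dim=4)
store.start_task()               # task 1: new prompt added
store.start_task()               # task 2: old prompt kept, new one added
prefix = store.memory_prefix()
print(len(prefix))               # 2 tasks x 2 vectors each -> 4 vectors
```

The key point the sketch captures is that the prompt store only ever grows: nothing learned for an earlier task is overwritten, which is how the prompts act as an episodic memory against catastrophic forgetting.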