Continual BERT: Continual Learning for Adaptive Extractive Summarization of COVID-19 Literature

12 Aug 2020 (modified: 24 May 2023) · Submitted to NLP-COVID19-EMNLP · Readers: Everyone
TL;DR: A novel BERT architecture to continually learn on new COVID-19 literature for extractive summarization while minimizing catastrophic forgetting
Abstract: The scientific community continues to publish an overwhelming amount of new COVID-19 research every day, leaving much of this literature with little to no attention. To help the community keep pace with the rapidly growing body of COVID-19 literature, we propose a novel BERT architecture that provides brief yet original summaries of lengthy papers. The model continually learns from new data in an online fashion while minimizing catastrophic forgetting, fitting the needs of the community. Benchmarks and manual examination of its performance show that the model provides sound summaries of new scientific literature.
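To make the core idea of the abstract concrete, below is a minimal, hypothetical sketch of one common way to continually fine-tune a BERT-based extractive summarizer while limiting catastrophic forgetting: an elastic weight consolidation (EWC)-style quadratic penalty that anchors parameters near values learned on earlier literature. This is not the paper's actual architecture or training procedure; the model name, the sentence-scoring head, and the hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch: online fine-tuning of a BERT sentence scorer with an
# EWC-style penalty. Not the paper's method; names and values are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class ExtractiveScorer(nn.Module):
    """Scores each candidate sentence for inclusion in the extractive summary."""
    def __init__(self, encoder_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.head = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]      # [CLS] embedding per sentence
        return self.head(cls).squeeze(-1)      # one relevance logit per sentence

def ewc_penalty(model, fisher, old_params):
    """Quadratic penalty keeping weights close to those learned on earlier data,
    weighted by a diagonal Fisher-information estimate (empty dicts = no penalty)."""
    loss = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return loss

def online_update(model, optimizer, batch, labels, fisher, old_params, lam=0.4):
    """One continual-learning step on a batch of sentences from newly published papers."""
    model.train()
    logits = model(batch["input_ids"], batch["attention_mask"])
    task_loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
    loss = task_loss + lam * ewc_penalty(model, fisher, old_params)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Illustrative usage on a toy batch of sentences.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = ExtractiveScorer()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

sentences = ["COVID-19 spreads primarily through respiratory droplets.",
             "The weather was pleasant that day."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([1.0, 0.0])  # 1 = keep sentence in the summary

# fisher / old_params would normally be snapshots from the previous data stream;
# empty dicts make the penalty a no-op for this first step.
online_update(model, optimizer, batch, labels, fisher={}, old_params={})
```

In this kind of setup, the Fisher estimates and parameter snapshot would be refreshed after each batch of newly published papers, so later updates are discouraged from overwriting weights that were important for summarizing earlier literature.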