Continual BERT: Continual Learning for Adaptive Extractive Summarization of COVID-19 Literature

01 Jul 2020 (modified: 03 Jul 2024) · Submitted to NLP-COVID-2020
TL;DR: A novel BERT architecture to continually learn on new COVID-19 literature for extractive summarization while minimizing catastrophic forgetting
Abstract: The scientific community continues to publish an overwhelming amount of new research related to COVID-19 on a daily basis, leaving much of the literature with little to no attention. To aid the community in understanding the rapidly growing body of COVID-19 literature, we propose a novel BERT architecture that provides a brief yet original summarization of lengthy papers. The model continually learns on new data in an online fashion while minimizing catastrophic forgetting, thus fitting the needs of the community. Benchmarks and manual examination of its performance show that the model provides sound summaries of new scientific literature.
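The abstract does not specify how catastrophic forgetting is mitigated, so the sketch below is only an illustrative assumption: a common approach is an Elastic Weight Consolidation (EWC) style quadratic penalty that anchors parameters important to previously seen literature while the encoder is fine-tuned online on newly published papers. Function names, the `lam` coefficient, and the generic `model`/`loss_fn` interfaces are hypothetical, not taken from the paper.

```python
# Minimal sketch of EWC-regularized online fine-tuning (illustrative only,
# not the authors' exact method).
import torch


def fisher_diagonal(model, data_loader, loss_fn):
    """Approximate the diagonal Fisher information on previously seen data."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for batch, labels in data_loader:
        model.zero_grad()
        loss = loss_fn(model(batch), labels)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(data_loader), 1) for n, f in fisher.items()}


def ewc_penalty(model, old_params, fisher, lam=100.0):
    """Quadratic penalty keeping parameters close to their values on older literature."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return lam * penalty


def continual_step(model, optimizer, batch, labels, loss_fn, old_params, fisher):
    """One online update on newly published papers, with the forgetting penalty added."""
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(batch), labels) + ewc_penalty(model, old_params, fisher)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this style of setup, `old_params` is a snapshot of the weights after training on earlier papers and `fisher` is computed on that earlier data, so updates on new literature are pulled back toward weights that mattered for previously learned summaries.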
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/continual-bert-continual-learning-for/code)