Incorporating Syntactic Knowledge into Pre-trained Language Model using Optimization for Overcoming Catastrophic Forgetting
Submission Type: Regular Long Paper
Submission Track: Syntax, Parsing and their Applications
Submission Track 2: Efficient Methods for NLP
Keywords: syntax, BERT, language model, optimization, catastrophic forgetting
TL;DR: We explore additional training that incorporates syntactic knowledge into a pre-trained language model while avoiding catastrophic forgetting.
Abstract: Syntactic knowledge is invaluable for many tasks that handle complex or long sentences, but typical pre-trained language models do not encode sufficient syntactic knowledge, which leads to failures on downstream tasks that require it.
In this paper, we explore additional training that incorporates syntactic knowledge into a language model. We design four pre-training tasks, each targeting a different syntactic perspective.
To add new syntactic knowledge while keeping a good balance between the original and the additional knowledge, we address catastrophic forgetting, which would otherwise cause the model to lose semantic information as it learns the additional syntactic knowledge. We demonstrate that the additional syntactic training produces consistent performance gains while clearly avoiding catastrophic forgetting.
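Note: the abstract does not specify how the optimization mitigates catastrophic forgetting. As one illustrative possibility only, the sketch below shows a common approach of this kind, an EWC-style quadratic penalty that discourages the weights from drifting away from the original pre-trained model while a new (here, syntactic) objective is learned. The names `syntactic_loss_fn`, `ref_params`, and `fisher` are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch of an EWC-style penalty for continual training (illustrative only;
# not necessarily the method used in this paper).
import torch


def ewc_penalty(model, ref_params, fisher, lam=1.0):
    """Quadratic penalty keeping parameters close to the original pre-trained weights,
    weighted by a (diagonal) Fisher estimate of each parameter's importance."""
    penalty = 0.0
    for name, param in model.named_parameters():
        if name in ref_params:
            penalty = penalty + (fisher[name] * (param - ref_params[name]) ** 2).sum()
    return lam * penalty


def training_step(model, batch, ref_params, fisher, optimizer, syntactic_loss_fn, lam=1.0):
    """One update combining a new syntactic objective with the forgetting penalty."""
    optimizer.zero_grad()
    task_loss = syntactic_loss_fn(model, batch)  # hypothetical syntactic pre-training loss
    loss = task_loss + ewc_penalty(model, ref_params, fisher, lam)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In such a setup, `ref_params` is a frozen copy of the original model's parameters and `lam` trades off plasticity (learning the syntactic tasks) against stability (retaining the original semantic knowledge).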
Submission Number: 55