Measuring and Reducing Model Update Regression in Structured Prediction for NLP

Published: 31 Oct 2022, Last Modified: 11 Jan 2023, NeurIPS 2022 Accept
Keywords: Model Update Regression, Structured Prediction, Backward Compatibility
Abstract: Recent advances in deep learning have led to the rapid adoption of machine learning based NLP models in a wide range of applications. While accuracy continues to improve, backward compatibility is also important for industrial applications, yet it has received little research attention. Backward compatibility requires that the new model does not regress on cases that were correctly handled by its predecessor. This work studies model update regression in structured prediction tasks. We choose syntactic dependency parsing and conversational semantic parsing as representative examples of structured prediction tasks in NLP. First, we measure and analyze model update regression in different model update settings. Next, we explore and benchmark existing techniques for reducing model update regression, including model ensemble and knowledge distillation. We further propose a simple and effective method, Backward-Congruent Re-ranking (BCR), which takes into account the characteristics of structured outputs. Experiments show that BCR mitigates model update regression better than model ensemble and knowledge distillation approaches.
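For concreteness, model update regression of this kind is typically quantified with a negative flip rate: the fraction of test cases the old model handled correctly but the new model gets wrong. Below is a minimal Python sketch of that measurement; the parallel lists of predictions and gold structures are hypothetical placeholders, not an interface from the paper.

    # Negative flip rate (sketch): share of examples correct under the old
    # model but wrong under the new one. `old_preds`, `new_preds`, and `gold`
    # are assumed parallel lists of structured outputs (e.g., parse trees).
    def negative_flip_rate(old_preds, new_preds, gold):
        flips = sum(
            1
            for old, new, ref in zip(old_preds, new_preds, gold)
            if old == ref and new != ref
        )
        return flips / len(gold)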
TL;DR: We find that model update regression is a severe and widespread problem in NLP structured prediction, and we explore several mitigation methods, including a novel, simple, and effective approach called backward-congruent re-ranking.
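The re-ranking recipe itself is simple enough to sketch: let the new model propose its k-best candidate structures, then let the old model pick among them. The sketch below assumes a hypothetical interface with `new_model.kbest(x, k)` and `old_model.score(x, y)`; it illustrates the general idea, not the paper's exact implementation.

    # Backward-Congruent Re-ranking (sketch): among the new model's k-best
    # candidates, return the one the old model scores highest, keeping the
    # final prediction congruent with the predecessor where possible.
    def bcr_predict(old_model, new_model, x, k=8):
        candidates = new_model.kbest(x, k)  # hypothetical k-best API
        return max(candidates, key=lambda y: old_model.score(x, y))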
Supplementary Material: pdf