Non-Autoregressive Neural Machine Translation with Consistency Regularization Optimized Variational Framework

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission · Readers: Everyone
Abstract: The Variational Autoencoder (VAE) is an effective way to model token interdependency for non-autoregressive neural machine translation (NAT). LaNMT, a representative VAE-based latent-variable NAT framework, achieves large improvements over vanilla models but still suffers from two main issues that lower translation quality: (1) a mismatch between training and inference conditions and (2) inadequate latent representations. In this work, we address these issues by proposing posterior consistency regularization. Specifically, we first apply stochastic data augmentation to the input samples to better adapt the model to the inference condition, and then perform consistency training on the posterior latent variables to obtain a more robust posterior network with better latent representations. Experiments on the En-De/De-En/En-Ro benchmarks confirm the effectiveness of our method, which improves over the baseline model by about 1.3/0.7/0.8 BLEU points while decoding about $12.6\times$ faster than the autoregressive Transformer.
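To make the idea concrete, below is a minimal PyTorch-style sketch of what a posterior consistency term of this kind could look like: two stochastically augmented views of the same source are fed through the posterior network, and a symmetric KL divergence between the two resulting latent distributions is penalized. The names `posterior_net`, `word_dropout`, and `p_drop`, as well as the choice of word dropout as the augmentation and a Gaussian posterior with symmetric KL, are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import torch

def word_dropout(tokens, p=0.1, unk_id=3, pad_id=0):
    # Stochastic augmentation (assumed here): randomly replace
    # non-padding tokens with <unk> to create a noisy view of the input.
    mask = (torch.rand_like(tokens, dtype=torch.float) < p) & (tokens != pad_id)
    return tokens.masked_fill(mask, unk_id)

def gaussian_kl(mu_p, logvar_p, mu_q, logvar_q):
    # KL( N(mu_p, var_p) || N(mu_q, var_q) ), summed over latent dims,
    # averaged over the batch.
    var_p, var_q = logvar_p.exp(), logvar_q.exp()
    kl = 0.5 * (logvar_q - logvar_p + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)
    return kl.sum(-1).mean()

def posterior_consistency_loss(posterior_net, src, tgt, p_drop=0.1):
    # posterior_net is a hypothetical module returning the mean and
    # log-variance of q(z | x, y) for a (source, target) pair.
    mu1, logvar1 = posterior_net(word_dropout(src, p_drop), tgt)
    mu2, logvar2 = posterior_net(word_dropout(src, p_drop), tgt)
    # Symmetric KL encourages the posterior to produce consistent
    # latent distributions across the two augmented views.
    return 0.5 * (gaussian_kl(mu1, logvar1, mu2, logvar2)
                  + gaussian_kl(mu2, logvar2, mu1, logvar1))
```

In training, a term like this would be added to the usual ELBO objective with a weighting coefficient; the exact augmentation, divergence, and weighting used in the paper are not specified in this abstract.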
Paper Type: long