Simply Trainable Nearest Neighbour Machine Translation with GPU Inference

Anonymous

16 Oct 2023 · ACL ARR 2023 October Blind Submission
Abstract: Nearest-neighbor machine translation (kNN-MT) is a successful approach for fast domain adaptation: it interpolates a pre-trained transformer with domain-specific token-level k-nearest-neighbor retrieval, without retraining. Despite kNN-MT's success, searching a large reference corpus and using a fixed interpolation between the kNN and pre-trained distributions lead to challenges in both computational complexity and translation quality. Prior work has proposed dynamically retrieving a small number of reference samples and introduced a distance-aware interpolation method based on an equation with free parameters. In this paper, we propose a simply trainable nearest-neighbor machine translation method and carry out inference experiments on GPUs. Specifically, we first adaptively construct a small datastore for each input sentence. Second, we train the interpolation coefficient between the kNN-MT and pre-trained model outputs so that the interpolation adapts automatically across domains. Experimental results on several domains show that our proposed method at least matches the translation quality of other methods in the literature while being fully automatic. In addition, our GPU inference results demonstrate that kNN-MT can be integrated into GPU inference with a drop of only 5% in speed.
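
The core of kNN-MT is mixing the pre-trained model's next-token distribution with a retrieval-based one, p(y) = λ·p_kNN(y) + (1−λ)·p_MT(y), and the abstract's contribution is making λ trainable rather than fixed or hand-tuned. The snippet below is a minimal sketch of that idea in PyTorch; the gating network, its inputs (retrieval distances), and all names here are illustrative assumptions, not the authors' actual parameterization.

```python
import torch
import torch.nn as nn

class TrainableInterpolation(nn.Module):
    """Learn the kNN-MT mixing coefficient lambda from retrieval distances.

    Hypothetical sketch: the paper's actual parameterization may differ.
    """

    def __init__(self, k: int):
        super().__init__()
        # Map the k neighbor distances to a scalar lambda in (0, 1).
        self.gate = nn.Sequential(nn.Linear(k, 1), nn.Sigmoid())

    def forward(self, p_mt: torch.Tensor, p_knn: torch.Tensor,
                distances: torch.Tensor) -> torch.Tensor:
        # p_mt, p_knn: (batch, vocab) next-token distributions
        # distances:   (batch, k) distances of the retrieved neighbors
        lam = self.gate(distances)               # (batch, 1), broadcasts over vocab
        return lam * p_knn + (1.0 - lam) * p_mt  # interpolated distribution

# Usage sketch (assumed training setup): keep the base MT model frozen and
# train only the gate with NLL loss on the interpolated distribution, e.g.
#   probs = interp(p_mt, p_knn, distances)
#   loss = torch.nn.functional.nll_loss(probs.log(), target_tokens)
```

A plausible reading of "simply trainable" is that only this small gate has free parameters, so adapting to a new domain requires training a handful of weights rather than retuning a fixed λ per domain.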
Paper Type: short
Research Area: Machine Translation
Contribution Types: NLP engineering experiment
Languages Studied: English, German, Czech
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.