Character-level Translation with Self-attention

09 Dec 2019 (modified: 30 Apr 2020) · OpenReview Anonymous Preprint Blind Submission · Readers: Everyone
Keywords: natural language processing, machine translation, character-level models
TL;DR: We perform an in-depth investigation of the suitability of self-attention models for character-level neural machine translation.
Abstract: We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English using up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level and converges faster while learning more robust character-level alignments.
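
The abstract describes an encoder block that first combines information from nearby characters with convolutions before applying self-attention. The paper's exact architecture is not given on this page, so the following is only a minimal sketch of that idea, assuming PyTorch and batch-first tensors; the class name ConvCharEncoderLayer, the kernel size of 5, and the residual/normalization layout are illustrative assumptions rather than the authors' design.

```python
# Minimal sketch (not the authors' exact architecture): a transformer encoder
# block whose input is first mixed across nearby characters with a 1-D
# convolution, then passed through standard self-attention and a feed-forward
# network. Assumes inputs of shape (batch, seq_len, d_model).
import torch
import torch.nn as nn


class ConvCharEncoderLayer(nn.Module):
    def __init__(self, d_model=512, nhead=8, dim_ff=2048, kernel_size=5, dropout=0.1):
        super().__init__()
        # Convolution over the character dimension; padding keeps length unchanged.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(), nn.Linear(dim_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # 1) Mix information from nearby characters with a convolution.
        conv_out = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + self.dropout(conv_out))
        # 2) Self-attention over the locally mixed character representations.
        attn_out, _ = self.self_attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm2(x + self.dropout(attn_out))
        # 3) Position-wise feed-forward network.
        x = self.norm3(x + self.dropout(self.ff(x)))
        return x


if __name__ == "__main__":
    layer = ConvCharEncoderLayer()
    chars = torch.randn(2, 100, 512)  # (batch, characters, embedding dim)
    print(layer(chars).shape)  # torch.Size([2, 100, 512])
```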
