PhraseAttn: Dynamic Slot Capsule Networks for phrase representation in Neural Machine Translation

Published: 01 Jan 2022, Last Modified: 19 Jun 2023. J. Intell. Fuzzy Syst., 2022.
Abstract: Word representation plays a vital role in most Natural Language Processing systems, especially in Neural Machine Translation. Word-level representation tends to capture the semantics of, and similarity between, individual words well, but struggles to represent the meaning of phrases or multi-word expressions. In this paper, we investigate a method for generating and using phrase information in a translation model. To generate phrase representations, a Primary Phrase Capsule network is first employed, and its output is then iteratively refined with a Slot Attention mechanism. Experiments on the IWSLT English-to-Vietnamese, English-to-French, and English-to-German datasets show that our proposed method consistently outperforms the baseline Transformer and attains competitive results against the scaled Transformer while using half as many parameters.
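To make the iterative refinement step concrete, below is a minimal sketch of a standard Slot Attention update (Locatello et al., 2020) applied to token embeddings, which is the mechanism the abstract names. The module name, hyperparameters, and the GRU-based slot update are assumptions drawn from that paper, not the authors' PhraseAttn implementation; in the full model the slots would be initialized from the Primary Phrase Capsule outputs rather than from random noise.

```python
import torch
import torch.nn as nn

class SlotAttention(nn.Module):
    """Iteratively refines K phrase slots over a sequence of token features.

    A minimal sketch following Locatello et al. (2020); the paper's exact
    Dynamic Slot Capsule formulation may differ.
    """

    def __init__(self, num_slots: int, dim: int, iters: int = 3):
        super().__init__()
        self.num_slots, self.iters, self.scale = num_slots, iters, dim ** -0.5
        # Learned Gaussian from which the initial slots are sampled.
        self.slots_mu = nn.Parameter(torch.zeros(1, 1, dim))
        self.slots_log_sigma = nn.Parameter(torch.zeros(1, 1, dim))
        self.norm_in = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)
        self.norm_mlp = nn.LayerNorm(dim)
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        self.gru = nn.GRUCell(dim, dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, dim) -> slots: (batch, num_slots, dim)
        b, _, d = tokens.shape
        tokens = self.norm_in(tokens)
        k, v = self.to_k(tokens), self.to_v(tokens)
        slots = self.slots_mu + self.slots_log_sigma.exp() * torch.randn(
            b, self.num_slots, d, device=tokens.device)
        for _ in range(self.iters):
            prev = slots
            q = self.to_q(self.norm_slots(slots))
            # Slots compete for tokens: softmax over the slot axis.
            attn = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=1)
            attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)  # weighted mean
            updates = attn @ v  # (batch, num_slots, dim)
            slots = self.gru(updates.reshape(-1, d), prev.reshape(-1, d)).view(b, -1, d)
            slots = slots + self.mlp(self.norm_mlp(slots))
        return slots

# Usage: refine 8 phrase slots over a batch of 512-dim token embeddings.
phrases = SlotAttention(num_slots=8, dim=512)(torch.randn(2, 30, 512))
print(phrases.shape)  # torch.Size([2, 8, 512])
```

The softmax over the slot axis (rather than the token axis, as in ordinary attention) is what makes the slots compete for input tokens, so each slot tends to specialize in a contiguous phrase or multi-word expression after a few iterations.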