Cheat Codes to Quantify Missing Source Information in Neural Machine Translation

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission
Paper Link: https://openreview.net/forum?id=rlTZqr_Sg4j
Paper Type: Short paper (up to four pages of content + unlimited references and appendices)
Abstract: This paper describes a method to quantify the amount of information $H(t|s)$ contributed by the target sentence $t$ that is not present in the source $s$ in a neural machine translation system. We do this by providing the model with the target sentence in a highly compressed form (a "cheat code"), and exploring the effect of the size of the cheat code. We find that the model is able to capture extra information from just a single float representation of the target, and nearly reproduces the target with two 32-bit floats per target token.
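The core mechanism the abstract describes is a bottleneck that squeezes the target sentence down to a few floats per token before feeding it to the model. The paper's actual bottleneck is learned; the sketch below is only a hypothetical illustration of the shapes involved, using a random linear projection (the function name `make_cheat_code` and all dimensions are assumptions, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def make_cheat_code(target_embeddings: np.ndarray, k: int) -> np.ndarray:
    """Compress target token embeddings of shape (T, d) into k floats
    per token via a random linear projection -- a stand-in for the
    learned bottleneck described in the paper."""
    _, d = target_embeddings.shape
    proj = rng.standard_normal((d, k)) / np.sqrt(d)  # (d, k) projection
    return target_embeddings @ proj                  # result: (T, k)

# Toy target: 5 tokens with 64-dimensional embeddings.
target = rng.standard_normal((5, 64))
# Two floats per target token, matching the paper's strongest setting.
code = make_cheat_code(target, k=2)
print(code.shape)  # (5, 2)
```

In the paper's setup, this compressed code would be given to the translation model alongside the source, so that the gap between translation quality with and without the code quantifies the missing information.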
Presentation Mode: This paper will be presented in person in Seattle
Copyright Consent Signature (type Name Or NA If Not Transferrable): Proyag Pal
Copyright Consent Name And Address: University of Edinburgh, 10 Crichton Street, Edinburgh EH8 9LE, United Kingdom