A New Syntactic Metric for Evaluation of Machine Translation

ACL Student Research Workshop 2013 (modified: 16 Jul 2019)
Abstract: Machine translation (MT) evaluation aims at measuring the quality of a candidate translation by comparing it with a reference translation. This comparison can be performed on multiple levels: lexical, syntactic or semantic. In this paper, we propose a new syntactic metric for MT evaluation based on the comparison of the dependency structures of the reference and the candidate translations. The dependency structures are obtained by means of a Weighted Constraint Dependency Grammar (WCDG) parser. Based on experiments performed on English-to-German translations, we show that the new metric correlates well with human judgments at the system level.
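The abstract does not spell out how the dependency structures are compared. As a rough illustration of the general idea only, the sketch below scores the overlap of (head, relation, dependent) triples between a candidate and a reference analysis with a simple F-measure. The triple representation, the dependency_f1 function, and the hand-written toy German analyses are assumptions for illustration; they are not the paper's actual metric, nor the output of the WCDG parser.

    from collections import Counter

    # A dependency analysis represented as (head, relation, dependent) triples.
    # In the paper these come from a WCDG parser; here the triples are
    # hand-written purely for illustration.
    Triple = tuple

    def dependency_f1(candidate: list, reference: list) -> float:
        """Hypothetical score: F-measure over overlapping dependency triples."""
        cand_counts = Counter(candidate)
        ref_counts = Counter(reference)
        overlap = sum((cand_counts & ref_counts).values())  # shared triples
        if overlap == 0:
            return 0.0
        precision = overlap / sum(cand_counts.values())
        recall = overlap / sum(ref_counts.values())
        return 2 * precision * recall / (precision + recall)

    # Toy example: reference "Der Hund schläft" vs. candidate "Ein Hund schläft"
    reference = [("schläft", "SUBJ", "Hund"), ("Hund", "DET", "Der")]
    candidate = [("schläft", "SUBJ", "Hund"), ("Hund", "DET", "Ein")]
    print(dependency_f1(candidate, reference))  # 0.5

Scoring labeled triples rather than surface n-grams is what makes such a metric syntactic: word-order variation that preserves the dependency structure is not penalized, while an attachment or labeling error is.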