Publication

Analysing Coreference in Transformer Outputs

Ekaterina Lapshinova-Koltunski, Cristina España-Bonet, Josef van Genabith

In: Fourth Workshop on Discourse in Machine Translation (DiscoMT 2019), co-located with EMNLP-IJCNLP 2019, November 3, Hong Kong, China, pages 1-12, ACL, 11/2019.

Abstract

We analyse coreference phenomena in three neural machine translation systems trained with different data settings with or without access to explicit intra- and cross-sentential anaphoric information. We compare system performance on two different genres: news and TED talks. To do this, we manually annotate (the possibly incorrect) coreference chains in the MT outputs and evaluate the coreference chain translations. We define an error typology that aims to go further than pronoun translation adequacy and includes types such as incorrect word selection or missing words. The features of coreference chains in automatic translations are also compared to those of the source texts and human translations. The analysis shows stronger potential translationese effects in machine translated outputs than in human translations.


Further Links

coref_DiscoMT.pdf (pdf, 443 KB)

Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence)