
Publication

The Transference Architecture for Automatic Post-Editing

Santanu Pal; Hongfei Xu; Nico Herbig; Sudip Kumar Naskar; Antonio Krüger; Josef van Genabith
In: ArXiv e-prints, arXiv:1908.06151, pp. 1-10, 2019.

Abstract

In automatic post-editing (APE), it makes sense to condition post-editing (pe) decisions on both the source (src) and the machine-translated text (mt) as input. This has led to multi-source encoder based APE approaches. A research challenge now is the search for architectures that best support the capture, preparation, and provision of src and mt information and its integration with pe decisions. In this paper we present a new multi-source APE model, called transference. Unlike previous approaches, it (i) uses a transformer encoder block for src, (ii) followed by a decoder block, but without masking for self-attention on mt, which effectively acts as a second encoder combining src -> mt, and (iii) feeds this representation into a final decoder block generating pe. Our model outperforms the state of the art by 1 BLEU point on the WMT 2016, 2017, and 2018 English–German APE shared tasks (PBSMT and NMT). We further investigate the importance of the newly introduced second encoder and find that too few layers in it hurt performance, while reducing the number of decoder layers matters little.
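The three-stage design described in the abstract maps onto standard transformer components. Below is a minimal PyTorch sketch of that structure; it is not the authors' implementation, and module names, layer counts, and hyperparameters are illustrative assumptions (positional encodings, dropout, and embedding tying are omitted for brevity).

```python
import torch
import torch.nn as nn

class Transference(nn.Module):
    """Sketch of the transference architecture: (i) an encoder over src,
    (ii) a decoder block over mt with UNMASKED self-attention that
    cross-attends to the src encoding (a second encoder for src -> mt),
    (iii) a standard autoregressive decoder generating pe."""

    def __init__(self, vocab_size, d_model=512, nhead=8,
                 n_enc=6, n_src_mt=6, n_dec=6):  # layer counts are assumptions
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # (i) transformer encoder block for src
        self.enc_src = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_enc)
        # (ii) decoder block over mt, used WITHOUT a causal mask,
        # so it sees the whole mt sequence while attending to src
        self.enc_src_mt = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_src_mt)
        # (iii) final decoder generating pe from the src->mt representation
        self.dec_pe = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_dec)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, mt, pe_in):
        h_src = self.enc_src(self.embed(src))
        # no tgt_mask here: mt self-attention is deliberately unmasked
        h_src_mt = self.enc_src_mt(self.embed(mt), memory=h_src)
        # causal mask keeps pe generation autoregressive
        causal = nn.Transformer.generate_square_subsequent_mask(
            pe_in.size(1)).to(pe_in.device)
        h_pe = self.dec_pe(self.embed(pe_in), memory=h_src_mt, tgt_mask=causal)
        return self.out(h_pe)
```

The key design point the sketch illustrates is step (ii): reusing a decoder layer as an encoder simply by dropping the causal mask, so mt is contextualized bidirectionally while its cross-attention already fuses in src information before pe is ever decoded.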
