2019.iwslt-1.13@ACL

Total: 1

#1 KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation

Authors: Felix Schneider; Alex Waibel

In this paper, we describe KIT’s submission for the IWSLT 2019 shared task on text translation. Our system is based on the transformer model [1] using our in-house implementation. We augment the available training data using back-translation and employ fine-tuning for the final model. For our best results, we used a 12-layer transformer-big configuration, achieving state-of-the-art results on the WMT2018 test set. We also experiment with student-teacher models to improve the performance of smaller models.
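
The back-translation step mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' pipeline; it only shows the general idea of pairing target-side monolingual sentences with synthetic source sentences produced by a reverse-direction model, then mixing those pairs with the genuine parallel data. All names here (back_translate, build_training_data, toy_backward_translate) are hypothetical placeholders; in practice the backward translator would be a trained target-to-source transformer rather than the toy stub used for demonstration.

    # Minimal sketch of back-translation data augmentation (hypothetical interfaces).
    from typing import Callable, List, Tuple


    def back_translate(
        monolingual_target: List[str],
        backward_translate: Callable[[str], str],
    ) -> List[Tuple[str, str]]:
        """Create synthetic (source, target) pairs from target-side monolingual text."""
        return [(backward_translate(t), t) for t in monolingual_target]


    def build_training_data(
        parallel: List[Tuple[str, str]],
        synthetic: List[Tuple[str, str]],
    ) -> List[Tuple[str, str]]:
        """Mix genuine parallel pairs with back-translated pairs for training."""
        return parallel + synthetic


    if __name__ == "__main__":
        # Toy stub: a real system would load a trained target-to-source model here.
        def toy_backward_translate(sentence: str) -> str:
            return "<synthetic source for: " + sentence + ">"

        mono = ["Das ist ein Testsatz.", "Noch ein Satz."]
        real = [("This is a test sentence.", "Das ist ein Testsatz.")]

        synthetic_pairs = back_translate(mono, toy_backward_translate)
        training_data = build_training_data(real, synthetic_pairs)
        for src, tgt in training_data:
            print(src, "|||", tgt)

The design point is simply that the target side of the synthetic pairs is always human-written text, so the forward model learns to produce fluent output even when the synthetic sources are noisy.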