In this paper, we introduce the University of Tsukuba’s submission to the IWSLT20 Open Domain Translation Task. We participate in both the Chinese→Japanese and Japanese→Chinese directions. For both directions, our machine translation systems are based on the Transformer architecture. We integrate several techniques to boost the performance of our models: data filtering, large-scale noised training, model ensembling, reranking, and postprocessing. Consequently, our systems achieve a BLEU score of 33.0 for Chinese→Japanese translation and 32.3 for Japanese→Chinese translation.
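The abstract names data filtering among the integrated techniques. Below is a minimal sketch of what such parallel-corpus filtering might look like for a Chinese–Japanese pair; the length thresholds, ratio cap, and CJK-script heuristic are illustrative assumptions, not the authors' actual filtering rules.

```python
# Illustrative parallel-data filtering sketch (assumed heuristics,
# not the paper's actual pipeline).

def has_cjk(text: str) -> bool:
    """Rough check that a string contains CJK ideographs or kana."""
    return any(
        "\u4e00" <= ch <= "\u9fff"      # CJK unified ideographs
        or "\u3040" <= ch <= "\u30ff"   # hiragana / katakana
        for ch in text
    )

def keep_pair(src: str, tgt: str,
              min_len: int = 1, max_len: int = 200,
              max_ratio: float = 3.0) -> bool:
    """Keep a sentence pair if both sides have sane lengths,
    their length ratio is bounded, and both look like CJK text."""
    ls, lt = len(src), len(tgt)
    if not (min_len <= ls <= max_len and min_len <= lt <= max_len):
        return False
    if max(ls, lt) / max(min(ls, lt), 1) > max_ratio:
        return False
    return has_cjk(src) and has_cjk(tgt)

# Usage: filter a list of (Chinese, Japanese) sentence pairs.
pairs = [("今天天气很好。", "今日はいい天気です。"), ("asdf", "！！！")]
filtered = [p for p in pairs if keep_pair(*p)]  # keeps only the first pair
```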
@inproceedings{cui-etal-2020-university,
    title = "{U}niversity of {T}sukuba's Machine Translation System for {IWSLT}20 Open Domain Translation Task",
    author = "Cui, Hongyi and Wei, Yizhen and Iida, Shohei and Utsuro, Takehito and Nagata, Masaaki",
    editor = {Federico, Marcello and Waibel, Alex and Knight, Kevin and Nakamura, Satoshi and Ney, Hermann and Niehues, Jan and St{\"u}ker, Sebastian and Wu, Dekai and Mariani, Joseph and Yvon, Francois},
    booktitle = "Proceedings of the 17th International Conference on Spoken Language Translation",
    month = jul,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.iwslt-1.17/",
    doi = "10.18653/v1/2020.iwslt-1.17",
    pages = "145--148"
}
[University of Tsukuba’s Machine Translation System for IWSLT20 Open Domain Translation Task](https://aclanthology.org/2020.iwslt-1.17/) (Cui et al., IWSLT 2020)