
Conference Paper: A teacher-student framework for zero-resource neural machine translation

Title: A teacher-student framework for zero-resource neural machine translation
Authors: Chen, Y; Liu, Y; Cheng, Y; Li, VOK
Issue Date: 2017
Publisher: Association for Computational Linguistics. The Proceedings' web site is located at http://aclweb.org/anthology/D/D17/#1000
Citation: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), Vancouver, Canada, 30 July - 4 August 2017, v. 1: Long Papers, p. 1925-1935
Abstract: While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NMT model ("teacher") on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs.
Persistent Identifier: http://hdl.handle.net/10722/262432
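The core idea in the abstract — training the student to assign the same per-token probabilities as the teacher — can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows the word-level matching objective (a cross-entropy between teacher and student token distributions, averaged over target positions) using synthetic logits. All names and shapes here are illustrative assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the vocabulary axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def word_level_kd_loss(student_logits, teacher_logits):
    """Cross-entropy between the teacher's per-token distributions
    (computed from the pivot sentence) and the student's (computed
    from the source sentence), averaged over target positions."""
    p_teacher = softmax(teacher_logits)                    # (T, V)
    log_p_student = np.log(softmax(student_logits) + 1e-12)
    return -(p_teacher * log_p_student).sum(axis=-1).mean()

# Toy example: 3 target positions, vocabulary of 5 tokens.
rng = np.random.default_rng(0)
teacher_logits = rng.normal(size=(3, 5))
student_logits = rng.normal(size=(3, 5))

loss = word_level_kd_loss(student_logits, teacher_logits)
# If the student reproduced the teacher exactly, the loss would
# reduce to the teacher's own entropy, a lower bound.
loss_match = word_level_kd_loss(teacher_logits, teacher_logits)
```

Minimizing this loss over source-pivot pairs is what lets the student learn source-to-target translation with no source-target parallel data, since only the teacher ever sees the pivot language.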

 

DC Field / Value
dc.contributor.author: Chen, Y
dc.contributor.author: Liu, Y
dc.contributor.author: Cheng, Y
dc.contributor.author: Li, VOK
dc.date.accessioned: 2018-09-28T04:59:14Z
dc.date.available: 2018-09-28T04:59:14Z
dc.date.issued: 2017
dc.identifier.citation: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), Vancouver, Canada, 30 July - 4 August 2017, v. 1: Long Papers, p. 1925-1935
dc.identifier.uri: http://hdl.handle.net/10722/262432
dc.description.abstract: While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NMT model ("teacher") on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs.
dc.language: eng
dc.publisher: Association for Computational Linguistics. The Proceedings' web site is located at http://aclweb.org/anthology/D/D17/#1000
dc.relation.ispartof: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL)
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.title: A teacher-student framework for zero-resource neural machine translation
dc.type: Conference_Paper
dc.identifier.email: Li, VOK: vli@eee.hku.hk
dc.identifier.authority: Li, VOK=rp00150
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.18653/v1/P17-1176
dc.identifier.hkuros: 292194
dc.identifier.volume: 1: Long Papers
dc.identifier.spage: 1925
dc.identifier.epage: 1935
