Conference Paper: Neural Machine Translation with Bilingual History Involved Attention

Title: Neural Machine Translation with Bilingual History Involved Attention
Authors: Xue, Haiyang; Feng, Yang; You, Di; Zhang, Wen; Li, Jingyu
Keywords: Attention mechanism
Bilingual history information
Neural machine translation
Issue Date: 2019
Publisher: Springer
Citation: 8th CCF International Conference, NLPCC 2019, Dunhuang, China, October 9–14, 2019. In Tang, J, Kan, MY, Zhao, D, et al. (Eds), Natural Language Processing and Chinese Computing: 8th CCF International Conference, NLPCC 2019, Dunhuang, China, October 9–14, 2019, Proceedings, Part II, p. 265-275. Cham, Switzerland: Springer, 2019
Abstract: The use of attention in neural machine translation (NMT) has greatly improved translation performance, but NMT models usually calculate attention vectors independently at different time steps and consequently suffer from over-translation and under-translation. To mitigate this problem, in this paper we propose a method that considers, for each source word, the source and target information translated so far when calculating attention. The main idea is to keep track of the translated source and target information assigned to each source word at each time step and then accumulate this information to obtain the completion degree of each source word. In this way, in later attention calculations, the model can adjust the attention weights to give a reasonable final completion degree for each source word. Experimental results show that our method significantly outperforms strong baseline systems on both the Chinese-English and English-German translation tasks and produces better alignment on the human-aligned data set.
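The abstract describes accumulating, step by step, the translation history assigned to each source word and using the resulting completion degree to adjust later attention weights. Below is a minimal, illustrative Python sketch of that general idea, not the authors' implementation: it only accumulates the attention mass each source word has received and subtracts a coverage-style penalty from later attention scores (the paper's method also tracks target-side history); names such as step_attention and penalty_weight are assumptions for illustration.

import numpy as np

def step_attention(scores, completion, penalty_weight=1.0):
    # scores: raw attention scores for one decoding step, shape (src_len,)
    # completion: attention mass accumulated per source word so far, shape (src_len,)
    # Penalize source words that are already well covered (against over-translation)
    # so that rarely attended words stay attractive (against under-translation).
    adjusted = scores - penalty_weight * completion
    weights = np.exp(adjusted - adjusted.max())   # softmax over adjusted scores
    weights /= weights.sum()
    return weights, completion + weights          # updated completion degree

# Toy usage: three source words, three decoding steps.
completion = np.zeros(3)
for scores in [np.array([2.0, 1.0, 0.5]),
               np.array([2.0, 1.2, 0.8]),
               np.array([1.5, 1.0, 2.0])]:
    weights, completion = step_attention(scores, completion)
    print(weights.round(3), completion.round(3))

In the paper's formulation the accumulated history feeds into the attention computation so the model learns how to rebalance the weights itself; the fixed subtraction above is only meant to make the accumulate-then-adjust loop concrete.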
Persistent Identifier: http://hdl.handle.net/10722/312057
ISBN: 9783030322359
ISSN: 0302-9743
2023 SCImago Journal Rankings: 0.606
Series/Report no.: Lecture Notes in Computer Science ; 11839
Lecture Notes in Computer Science. Lecture Notes in Artificial Intelligence
LNCS sublibrary. SL 7, Artificial Intelligence

 

DC Field | Value | Language
dc.contributor.author | Xue, Haiyang | -
dc.contributor.author | Feng, Yang | -
dc.contributor.author | You, Di | -
dc.contributor.author | Zhang, Wen | -
dc.contributor.author | Li, Jingyu | -
dc.date.accessioned | 2022-04-06T04:32:05Z | -
dc.date.available | 2022-04-06T04:32:05Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | 8th CCF International Conference, NLPCC 2019, Dunhuang, China, October 9–14, 2019. In Tang, J, Kan, MY, Zhao, D, et al. (Eds), Natural Language Processing and Chinese Computing: 8th CCF International Conference, NLPCC 2019, Dunhuang, China, October 9–14, 2019, Proceedings, Part II, p. 265-275. Cham, Switzerland: Springer, 2019 | -
dc.identifier.isbn | 9783030322359 | -
dc.identifier.issn | 0302-9743 | -
dc.identifier.uri | http://hdl.handle.net/10722/312057 | -
dc.description.abstract | The use of attention in neural machine translation (NMT) has greatly improved translation performance, but NMT models usually calculate attention vectors independently at different time steps and consequently suffer from over-translation and under-translation. To mitigate this problem, in this paper we propose a method that considers, for each source word, the source and target information translated so far when calculating attention. The main idea is to keep track of the translated source and target information assigned to each source word at each time step and then accumulate this information to obtain the completion degree of each source word. In this way, in later attention calculations, the model can adjust the attention weights to give a reasonable final completion degree for each source word. Experimental results show that our method significantly outperforms strong baseline systems on both the Chinese-English and English-German translation tasks and produces better alignment on the human-aligned data set. | -
dc.language | eng | -
dc.publisher | Springer | -
dc.relation.ispartof | Natural Language Processing and Chinese Computing: 8th CCF International Conference, NLPCC 2019, Dunhuang, China, October 9–14, 2019, Proceedings, Part II | -
dc.relation.ispartofseries | Lecture Notes in Computer Science ; 11839 | -
dc.relation.ispartofseries | Lecture Notes in Computer Science. Lecture Notes in Artificial Intelligence | -
dc.relation.ispartofseries | LNCS sublibrary. SL 7, Artificial Intelligence | -
dc.subject | Attention mechanism | -
dc.subject | Bilingual history information | -
dc.subject | Neural machine translation | -
dc.title | Neural Machine Translation with Bilingual History Involved Attention | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1007/978-3-030-32236-6_23 | -
dc.identifier.scopus | eid_2-s2.0-85075829968 | -
dc.identifier.spage | 265 | -
dc.identifier.epage | 275 | -
dc.identifier.eissn | 1611-3349 | -
dc.publisher.place | Cham, Switzerland | -
