Regular Paper

Document-Level Neural Machine Translation with Hierarchical Modeling of Global Context

School of Computer Science and Technology, Soochow University, Suzhou 215006, China

A preliminary version of the paper was published in the Proceedings of EMNLP-IJCNLP 2019.

Abstract

Document-level machine translation (MT) remains challenging because it is difficult to exploit document-level global context efficiently during translation. In this paper, we propose a hierarchical model to learn the global context for document-level neural machine translation (NMT): a sentence encoder captures intra-sentence dependencies, and a document encoder models document-level inter-sentence consistency and coherence. With this hierarchical architecture, we feed the extracted document-level global context back to each word in a top-down fashion, so that different translations of a word can be distinguished according to its specific surrounding context. Notably, we explore the effect of three popular attention functions during this information backward-distribution phase to examine in depth how our model distributes global context information. In addition, since large-scale in-domain document-level parallel corpora are usually unavailable, we adopt a two-step training strategy that exploits a large-scale corpus of out-of-domain parallel sentence pairs and a small-scale corpus of in-domain parallel document pairs to achieve domain adaptability. Experimental results on Chinese-English and English-German corpora show that our model significantly improves over the Transformer baseline by 4.5 BLEU points on average, demonstrating the effectiveness of the proposed hierarchical model for document-level NMT.
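
To make the data flow of the abstract concrete, the following is a minimal PyTorch sketch of the hierarchical context pipeline it describes: a sentence encoder over tokens, a document encoder over pooled sentence vectors, and a top-down attention step that distributes the global context back to each word. This is not the authors' implementation; the module sizes, the mean-pooling of sentence states, and the choice of scaled dot-product attention for the backward-distribution phase (one of the three attention functions the paper compares) are illustrative assumptions.

```python
# Hypothetical sketch of the hierarchical global-context encoder described
# in the abstract. NOT the authors' code: all names, sizes, and the use of
# scaled dot-product attention for the top-down step are assumptions.
import torch
import torch.nn as nn

class HierarchicalContextEncoder(nn.Module):
    def __init__(self, d_model=512, nhead=8, num_layers=2):
        super().__init__()
        sent_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Sentence encoder: captures intra-sentence dependencies.
        self.sent_encoder = nn.TransformerEncoder(sent_layer, num_layers)
        doc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Document encoder: models inter-sentence consistency and coherence
        # over pooled sentence representations.
        self.doc_encoder = nn.TransformerEncoder(doc_layer, num_layers)
        # Top-down step: every word queries the document-level context.
        self.top_down = nn.MultiheadAttention(d_model, nhead, batch_first=True)

    def forward(self, doc_tokens):
        # doc_tokens: (num_sents, sent_len, d_model) embeddings of one document
        word_states = self.sent_encoder(doc_tokens)         # intra-sentence
        sent_vecs = word_states.mean(dim=1)                 # pool to sentence vectors
        doc_ctx = self.doc_encoder(sent_vecs.unsqueeze(0))  # (1, num_sents, d_model)
        # Distribute the global context back to each word (top-down).
        q = word_states.reshape(1, -1, word_states.size(-1))
        ctx, _ = self.top_down(q, doc_ctx, doc_ctx)
        return word_states + ctx.reshape_as(word_states)    # context-enriched words

# Example: a 3-sentence document, 10 tokens per sentence, 512-dim embeddings.
enc = HierarchicalContextEncoder()
out = enc(torch.randn(3, 10, 512))   # (3, 10, 512) context-enriched states
```

In this sketch the residual addition at the end stands in for however the paper fuses the distributed global context with the word states; swapping the scaled dot-product module for additive or multiplicative attention would reproduce the other two attention functions the paper explores.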

Electronic Supplementary Material

jcst-37-2-295-Highlights.pdf (129.4 KB)


Cite this article:
Tan X, Zhang L-Y, Zhou G-D. Document-Level Neural Machine Translation with Hierarchical Modeling of Global Context. Journal of Computer Science and Technology, 2022, 37(2): 295-308. https://doi.org/10.1007/s11390-021-0286-3


Received: 09 March 2020
Accepted: 11 January 2021
Published: 31 March 2022
© Institute of Computing Technology, Chinese Academy of Sciences 2022