A Position Weighted Information Based Word Embedding Model for Machine Translation
International Journal on Artificial Intelligence Tools (IF 1.0), Pub Date: 2020-11-30, DOI: 10.1142/s0218213020400059
Zhen Li 1, Dan Qu 1, Yanxia Li 2, Chaojie Xie 3, Qi Chen 1

Deep learning technology has driven the development of neural machine translation (NMT), and end-to-end (E2E) architectures have become the mainstream in NMT. These systems use word vectors as the initial values of the input layer, so the quality of the word vector model directly affects the accuracy of E2E-NMT. Researchers have proposed many approaches to learning word representations and have achieved significant results; however, the drawbacks of these methods still limit the performance of E2E-NMT systems. This paper focuses on word embedding technology and proposes the PW-CBOW word vector model, which captures richer semantic information. We apply these word vector models to the IWSLT14 German-English, WMT14 English-German, and WMT14 English-French corpora, and the results evaluate the performance of the PW-CBOW model. In the latest E2E-NMT systems, the PW-CBOW word vector model can improve performance.
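The abstract does not spell out how PW-CBOW incorporates position information, so the following is only a minimal sketch of one plausible reading: a CBOW-style predictor in which context embeddings are combined with learnable per-position weights rather than a plain average. The class name `PositionWeightedCBOW` and the softmax-normalized weighting are illustrative assumptions, not the authors' published implementation.

```python
# Sketch (assumption, not the paper's code): CBOW where each context slot
# gets its own learnable weight, so positions nearer to or farther from the
# centre word can contribute differently to the prediction.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PositionWeightedCBOW(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int, window: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # one learnable weight per context slot (window words on each side)
        self.pos_weights = nn.Parameter(torch.ones(2 * window))
        self.out = nn.Linear(embed_dim, vocab_size)

    def forward(self, context_ids: torch.Tensor) -> torch.Tensor:
        # context_ids: (batch, 2*window) token ids surrounding the centre word
        ctx = self.embed(context_ids)                          # (batch, 2*window, dim)
        w = torch.softmax(self.pos_weights, dim=0)             # normalised position weights
        pooled = (ctx * w.unsqueeze(0).unsqueeze(-1)).sum(1)   # position-weighted sum
        return self.out(pooled)                                # logits over the vocabulary

# Toy usage: predict the centre word from its weighted context window.
model = PositionWeightedCBOW(vocab_size=1000, embed_dim=64, window=2)
context = torch.randint(0, 1000, (8, 4))   # batch of 8, window=2 -> 4 context slots
target = torch.randint(0, 1000, (8,))
loss = F.cross_entropy(model(context), target)
loss.backward()
```

In such a setup the trained input embeddings would then serve as the initial values of the NMT encoder's input layer, which is the role the abstract assigns to the word vector model.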

Updated: 2020-11-30