Syntax-aware neural machine translation directed by syntactic dependency degree
Neural Computing and Applications (IF 4.5), Pub Date: 2021-07-02, DOI: 10.1007/s00521-021-06256-4
Ru Peng, Yi Fang, Tianyong Hao

There are various ways to incorporate syntactic knowledge into neural machine translation (NMT). However, quantifying the dependency syntactic intimacy (DSI) between word pairs in a dependency tree has not been considered for use in attentional and transformer-based NMT. In this paper, we propose a variant of Tree-LSTM to capture the syntactic dependency degree (SDD) between word pairs in dependency trees. Two syntax-aware distances are proposed: a tuned syntax distance and a \(\rho\)-dependent distance. For attentional NMT, we propose two syntax-aware attentions based on these distances, and we also design a dual attention that simultaneously generates the global context and the dependency syntactic context. For transformer-based NMT, we explicitly incorporate dependency syntax into the self-attention network (SAN) to obtain a syntax-aware SAN. Experiments on the IWSLT'17 English–German, IWSLT Chinese–English, and WMT'15 English–Finnish translation tasks show that our syntax-aware NMT significantly improves translation quality compared with baseline methods, including the state-of-the-art transformer-based NMT.
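To make the idea concrete, below is a minimal PyTorch sketch (not the authors' implementation) of how a pairwise syntactic distance can bias attention scores. Plain dependency-tree path lengths stand in for the paper's Tree-LSTM-derived SDD values, and the additive `rho`-weighted penalty is an illustrative assumption, not the paper's exact \(\rho\)-dependent distance.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def tree_path_lengths(heads: list) -> torch.Tensor:
    """Pairwise path lengths between words in a dependency tree.

    `heads[i]` is the index of word i's head (-1 for the root). This plain
    tree distance is only a naive stand-in for the paper's SDD values,
    which are produced by a Tree-LSTM variant.
    """
    n = len(heads)

    def ancestors(i):
        # Chain from word i up to the root.
        chain = [i]
        while heads[chain[-1]] != -1:
            chain.append(heads[chain[-1]])
        return chain

    chains = [ancestors(i) for i in range(n)]
    dist = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            # Distance = steps from i to the lowest common ancestor
            # plus steps from j to it.
            pos = {node: d for d, node in enumerate(chains[j])}
            for d_i, node in enumerate(chains[i]):
                if node in pos:
                    dist[i, j] = d_i + pos[node]
                    break
    return dist

class SyntaxAwareAttention(nn.Module):
    """Sketch of syntax-aware (self-)attention: attention logits receive an
    additive bias that shrinks with syntactic distance, so syntactically
    close word pairs attend to each other more. The bias form and the
    `rho` parameter are illustrative assumptions."""

    def __init__(self, d_model: int, rho: float = 1.0):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.rho = rho
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, syntax_dist: torch.Tensor) -> torch.Tensor:
        # x: (batch, len, d_model); syntax_dist: (batch, len, len)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = torch.bmm(q, k.transpose(1, 2)) * self.scale
        scores = scores - self.rho * syntax_dist  # penalize distant pairs
        return torch.bmm(F.softmax(scores, dim=-1), v)

# Toy usage: a 4-word sentence whose dependency heads are given directly.
heads = [1, -1, 1, 2]
dist = tree_path_lengths(heads).unsqueeze(0)   # (1, 4, 4)
x = torch.randn(1, 4, 64)
out = SyntaxAwareAttention(64)(x, dist)        # (1, 4, 64)
```

A per-head learnable `rho` (or a gate between the syntactic and global contexts, as in the paper's dual attention) would be the natural next refinement of this sketch.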




Updated: 2021-07-04