Neural machine translation with Gumbel Tree-LSTM based encoder
Journal of Visual Communication and Image Representation (IF 2.6), Pub Date: 2020-04-13, DOI: 10.1016/j.jvcir.2020.102811
Chao Su, Heyan Huang, Shumin Shi, Ping Jian, Xuewen Shi

Neural machine translation has greatly improved translation accuracy and has received great attention from the machine translation community. Tree-based translation models aim to capture the syntactic or semantic relations among long-distance words or phrases in a sentence; however, they suffer from expensive manual annotation costs and poor automatic annotation accuracy. In this paper, we focus on how to encode a source sentence into a vector in an unsupervised tree-structured way and then decode it into a target sentence. Our model incorporates Gumbel Tree-LSTM, which learns to compose tree structures from plain text without any tree annotation. We evaluate the proposed model on both spoken-language and news corpora, and show that it outperforms the attentional seq2seq model and the Transformer base model.
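The Gumbel Tree-LSTM referred to in the abstract was introduced by Choi et al. (2018): adjacent node pairs are composed with a binary Tree-LSTM cell, each candidate parent is scored, and a straight-through Gumbel-softmax selects which pair to merge, so the tree structure is learned end-to-end from the task loss alone without parse annotations. Below is a minimal PyTorch sketch of that composition loop, not the authors' implementation; the names `BinaryTreeLSTMCell` and `gumbel_tree_encode` and the single-query scoring scheme are illustrative assumptions.

```python
# Minimal sketch of Gumbel Tree-LSTM composition (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinaryTreeLSTMCell(nn.Module):
    """Compose two child (h, c) states into one parent (h, c) state."""

    def __init__(self, dim):
        super().__init__()
        # Gates: input, left forget, right forget, output, candidate.
        self.proj = nn.Linear(2 * dim, 5 * dim)

    def forward(self, hl, cl, hr, cr):
        i, fl, fr, o, g = self.proj(torch.cat([hl, hr], dim=-1)).chunk(5, dim=-1)
        c = (torch.sigmoid(fl) * cl + torch.sigmoid(fr) * cr
             + torch.sigmoid(i) * torch.tanh(g))
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


def gumbel_tree_encode(h, c, cell, query, tau=1.0):
    """Reduce (n, dim) leaf states to one sentence vector by repeated merges."""
    while h.size(0) > 1:
        # Candidate parents for every adjacent pair of current nodes.
        ph, pc = cell(h[:-1], c[:-1], h[1:], c[1:])          # (n-1, dim)
        scores = ph @ query                                   # (n-1,)
        # Straight-through Gumbel-softmax: hard one-hot choice in the
        # forward pass, soft gradients in the backward pass.
        sel = F.gumbel_softmax(scores, tau=tau, hard=True)    # (n-1,)
        k = int(sel.argmax())
        merged_h = (sel.unsqueeze(-1) * ph).sum(0, keepdim=True)
        merged_c = (sel.unsqueeze(-1) * pc).sum(0, keepdim=True)
        h = torch.cat([h[:k], merged_h, h[k + 2:]], dim=0)
        c = torch.cat([c[:k], merged_c, c[k + 2:]], dim=0)
    return h.squeeze(0)


# Toy usage: encode five random "word embeddings" into one vector.
dim = 8
cell = BinaryTreeLSTMCell(dim)
query = torch.randn(dim)
sent = gumbel_tree_encode(torch.randn(5, dim), torch.zeros(5, dim), cell, query)
print(sent.shape)  # torch.Size([8])
```

Repeating the merge until a single node remains yields the sentence vector a seq2seq decoder would consume; batching and padding masks, which a practical NMT encoder needs, are omitted here for clarity.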




Updated: 2020-04-13