Multitask Pointer Network for Multi-Representational Parsing
arXiv - CS - Computation and Language. Pub Date: 2020-09-21. DOI: arxiv-2009.09730. Daniel Fernández-González and Carlos Gómez-Rodríguez
We propose a transition-based approach that, by training a single model, can
efficiently parse any input sentence with both constituent and dependency
trees, supporting both continuous/projective and discontinuous/non-projective
syntactic structures. To that end, we develop a Pointer Network architecture
with two separate task-specific decoders and a common encoder, and follow a
multitask learning strategy to jointly train them. The resulting system, which runs in quadratic time, is not only the first parser that can jointly produce both
unrestricted constituent and dependency trees from a single model, but also
shows that the two syntactic formalisms can benefit from each other during
training, achieving state-of-the-art accuracy on several widely used
benchmarks, such as the continuous English and Chinese Penn Treebanks and the
discontinuous German NEGRA and TIGER treebanks.
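The core architectural idea, a single shared encoder feeding two task-specific pointer decoders, can be sketched in plain Python. This is a toy illustration, not the authors' implementation: the encoder, the decoder weights, and the example sentence are all stand-ins for learned components, and the real model uses neural networks trained with multitask learning.

```python
import math

def encode(tokens, dim=4):
    # Toy shared encoder: a deterministic vector per token
    # (a stand-in for a learned BiLSTM/Transformer encoder).
    return [[math.sin(sum(map(ord, t)) + i) for i in range(dim)]
            for t in tokens]

def pointer_scores(query, states):
    # Pointer attention: dot-product score of the decoder query
    # against every encoder state in the input sentence.
    return [sum(q * s for q, s in zip(query, st)) for st in states]

def decode(states, weight):
    # Task-specific decoder: at each step, "point" to the encoder
    # position with the highest attention score. `weight` stands in
    # for learned, task-specific decoder parameters (hypothetical).
    picks = []
    for st in states:
        query = [w * x for w, x in zip(weight, st)]
        scores = pointer_scores(query, states)
        picks.append(max(range(len(scores)), key=scores.__getitem__))
    return picks

tokens = ["She", "reads", "books"]
states = encode(tokens)  # shared representation used by BOTH decoders
dep_ptrs = decode(states, weight=[1.0, 0.5, -0.5, 1.0])    # dependency decoder
const_ptrs = decode(states, weight=[-1.0, 1.0, 0.5, 0.5])  # constituent decoder
print(dep_ptrs, const_ptrs)
```

The point of the sketch is the sharing pattern: both decoders attend over the same encoder states, so gradients from either parsing task would update the common encoder, which is how the multitask setup lets the two formalisms benefit from each other.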
Updated: 2020-09-22