Multi-Task Attentive Residual Networks for Argument Mining
arXiv - CS - Computation and Language. Pub Date: 2021-02-24, DOI: arxiv-2102.12227
Andrea Galassi, Marco Lippi, Paolo Torroni

We explore the use of residual networks and neural attention for argument mining, and in particular for link prediction. The method we propose makes no assumptions about document or argument structure. We propose a residual architecture that exploits attention and multi-task learning, and makes use of ensembling. We evaluate it on a challenging dataset of user-generated comments, as well as on two datasets of scientific publications. On the user-generated content dataset, our model outperforms state-of-the-art methods that rely on domain knowledge. On the scientific literature datasets, it achieves results comparable to those of BERT-based approaches, but with a much smaller model size.
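To make the combination of ingredients concrete, here is a minimal sketch in PyTorch of a residual block built around self-attention, with two task heads sharing one encoder (link prediction over component pairs, plus component-type classification). This is an illustration of the general technique only, not the authors' actual architecture; all class names, dimensions, and head designs here are hypothetical.

import torch
import torch.nn as nn

class AttentiveResidualBlock(nn.Module):
    """Residual block whose transformation is self-attention over components."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)   # self-attention over the sequence
        return self.norm(x + attn_out)     # residual (skip) connection

class MultiTaskArgumentModel(nn.Module):
    """Shared attentive-residual encoder with two task-specific heads."""

    def __init__(self, dim: int = 128, num_types: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(
            AttentiveResidualBlock(dim),
            AttentiveResidualBlock(dim),
        )
        # Link head scores a (source, target) pair of component encodings.
        self.link_head = nn.Linear(2 * dim, 2)
        # Type head classifies each component individually.
        self.type_head = nn.Linear(dim, num_types)

    def forward(self, components, src_idx, dst_idx):
        # components: (batch, n_components, dim) embeddings, one per component
        h = self.encoder(components)
        batch = torch.arange(h.size(0))
        src, dst = h[batch, src_idx], h[batch, dst_idx]
        link_logits = self.link_head(torch.cat([src, dst], dim=-1))
        type_logits = self.type_head(h)
        return link_logits, type_logits

model = MultiTaskArgumentModel()
x = torch.randn(2, 5, 128)  # 2 documents, 5 argument components each
link_logits, type_logits = model(x, torch.tensor([0, 1]), torch.tensor([3, 2]))
print(link_logits.shape, type_logits.shape)  # (2, 2) and (2, 5, 4)

Multi-task learning here means both heads are trained jointly against a shared encoder; the paper additionally ensembles several models, a step the sketch omits.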

Updated: 2021-02-25