Exploiting Dependency Information to Improve Biomedical Event Detection via Gated Polar Attention Mechanism
Neurocomputing (IF 5.5), Pub Date: 2021-01-01, DOI: 10.1016/j.neucom.2020.09.020
Lishuang Li , Beibei Zhang

Abstract This paper tackles the task of biomedical event detection, which includes identifying and categorizing biomedical event triggers. We find that current dependency-driven biomedical event detection models fail to gain further distinct improvement from existing manual dependency embeddings. An interpretable hypothesis for this problem is that a model using manual dependency embeddings may suffer from low dependency information density (termed dependency weakness) and from the diffusion of noise from sparse dependency items (termed sparsity diffusion). We argue that dependency representation learning is more effective than existing manual dependency embeddings, as it can reduce dependency weakness and sparsity diffusion. In this work, we first confirm the hypothesis above and then propose to explicitly apply dependency representation learning and triple context representation learning to the biomedical event detection task via a gated polar attention mechanism. Specifically, we systematically investigate our model under the gated polar attention mechanism. Experimental results demonstrate that our approach outperforms recent state-of-the-art methods and achieves the best F-score on the biomedical benchmark MLEE dataset.
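The abstract does not spell out how the gated polar attention mechanism combines the dependency and triple context representations. The sketch below shows one plausible reading in PyTorch: a learned sigmoid gate decides how much dependency signal to admit per dimension, while a tanh-scored ("polar", i.e. signed) attention weight can suppress or negate noisy, sparse dependency items. All module names, dimensions, and the tanh/sigmoid choices are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a gated "polar" fusion of dependency and context
# representations; names, shapes, and scoring choices are assumptions.
import torch
import torch.nn as nn


class GatedPolarAttention(nn.Module):
    """Fuse a dependency representation with a triple-context representation
    through a learned gate, using signed scores in [-1, 1] instead of softmax."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(2 * hidden_dim, 1)           # signed attention score
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)   # element-wise gate

    def forward(self, dep_repr: torch.Tensor, ctx_repr: torch.Tensor) -> torch.Tensor:
        # dep_repr, ctx_repr: (batch, seq_len, hidden_dim)
        pair = torch.cat([dep_repr, ctx_repr], dim=-1)
        # "Polar" score: tanh keeps the weight signed, so sparse or noisy
        # dependency items can be down-weighted or even negated.
        polarity = torch.tanh(self.score(pair))             # (batch, seq_len, 1)
        gate = torch.sigmoid(self.gate(pair))                # (batch, seq_len, hidden_dim)
        # The gate controls, per dimension, how much dependency signal is admitted.
        return gate * (polarity * dep_repr) + (1.0 - gate) * ctx_repr


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 10, 64
    layer = GatedPolarAttention(hidden)
    dep = torch.randn(batch, seq_len, hidden)
    ctx = torch.randn(batch, seq_len, hidden)
    print(layer(dep, ctx).shape)  # torch.Size([2, 10, 64])
```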

Updated: 2021-01-01