"When they say weed causes depression, but it's your fav antidepressant": Knowledge-aware Attention Framework for Relationship Extraction
arXiv - CS - Computation and Language. Pub Date: 2020-09-21, DOI: arxiv-2009.10155
Shweta Yadav, Usha Lokala, Raminta Daniulaityte, Krishnaprasad Thirunarayan, Francois Lamy, Amit Sheth

With the increasing legalization of medical and recreational use of cannabis, more research is needed to understand the association between depression and consumer behavior related to cannabis consumption. Big social media data has the potential to provide public health analysts with deeper insights about these associations. In this interdisciplinary study, we demonstrate the value of incorporating domain-specific knowledge in the learning process to identify the relationships between cannabis use and depression. We develop an end-to-end knowledge-infused deep learning framework (Gated-K-BERT) that leverages the pre-trained BERT language representation model and a domain-specific declarative knowledge source, the Drug Abuse Ontology (DAO), to jointly extract entities and their relationship using a gated fusion sharing mechanism. Our model is further tailored to give more focus to the entity mentions in the sentence through an entity-position-aware attention layer, where the ontology is used to locate the positions of the target entities. Experimental results show that including the knowledge-aware attentive representation in association with BERT can extract the cannabis-depression relationship with better coverage than state-of-the-art relation extractors.
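The abstract names two mechanisms: a gated fusion that mixes BERT token representations with ontology-derived (DAO) knowledge vectors, and an entity-position-aware attention layer that weights tokens by their distance to the target entity mentions. The PyTorch sketch below illustrates one plausible reading of those two components; it is not the authors' implementation, and the module names, layer sizes, distance encoding, and two-class output head are all assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedFusion(nn.Module):
    """Gate that mixes contextual (BERT) token vectors with knowledge
    (ontology-derived) token vectors. Dimensions are hypothetical."""

    def __init__(self, hidden_dim: int, knowledge_dim: int):
        super().__init__()
        self.project = nn.Linear(knowledge_dim, hidden_dim)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, bert_repr: torch.Tensor, know_repr: torch.Tensor) -> torch.Tensor:
        # bert_repr: (batch, seq_len, hidden_dim); know_repr: (batch, seq_len, knowledge_dim)
        know = self.project(know_repr)
        g = torch.sigmoid(self.gate(torch.cat([bert_repr, know], dim=-1)))
        # Per-dimension gate decides how much contextual vs. knowledge signal to keep.
        return g * bert_repr + (1.0 - g) * know


class EntityPositionAttention(nn.Module):
    """Attention that scores tokens from the fused representation plus an
    embedding of each token's signed distance to the two target entities."""

    def __init__(self, hidden_dim: int, max_dist: int = 128, pos_dim: int = 32):
        super().__init__()
        self.max_dist = max_dist
        self.pos_emb = nn.Embedding(2 * max_dist + 1, pos_dim)
        self.score = nn.Linear(hidden_dim + 2 * pos_dim, 1)

    def forward(self, fused, dist_e1, dist_e2):
        # dist_e1 / dist_e2: (batch, seq_len) signed token distances to entity 1 / entity 2
        p1 = self.pos_emb(dist_e1.clamp(-self.max_dist, self.max_dist) + self.max_dist)
        p2 = self.pos_emb(dist_e2.clamp(-self.max_dist, self.max_dist) + self.max_dist)
        scores = self.score(torch.cat([fused, p1, p2], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)                        # (batch, seq_len)
        return torch.bmm(weights.unsqueeze(1), fused).squeeze(1)   # (batch, hidden_dim)


if __name__ == "__main__":
    batch, seq_len, hid, kdim = 2, 16, 768, 100
    bert_out = torch.randn(batch, seq_len, hid)    # stand-in for BERT token outputs
    know_out = torch.randn(batch, seq_len, kdim)   # stand-in for DAO-derived embeddings
    d1 = torch.randint(-15, 16, (batch, seq_len))  # distances to entity mention 1
    d2 = torch.randint(-15, 16, (batch, seq_len))  # distances to entity mention 2

    fused = GatedFusion(hid, kdim)(bert_out, know_out)
    sentence_vec = EntityPositionAttention(hid)(fused, d1, d2)
    relation_logits = nn.Linear(hid, 2)(sentence_vec)  # e.g. {related, not related}
    print(relation_logits.shape)  # torch.Size([2, 2])
```

In this reading, the gate lets the model fall back on ontology knowledge when the contextual signal is weak (e.g., slang such as "weed"), while the position embeddings steer attention toward tokens near the cannabis and depression mentions before relation classification.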

Updated: 2020-09-23