Enhanced prototypical network for few-shot relation extraction
Information Processing & Management (IF 8.6). Pub Date: 2021-04-07. DOI: 10.1016/j.ipm.2021.102596
Wen Wen, Yongbin Liu, Chunping Ouyang, Qiang Lin, Tonglee Chung

Most existing methods for relation extraction depend heavily on large-scale annotated data; they cannot exploit existing knowledge and generalize poorly. These problems urgently call for further development of few-shot learning methods. Because the most commonly used CNN model is weak at sequence labeling and at capturing long-range dependencies, we propose a novel model that integrates a Transformer into a prototypical network for more powerful relation-level feature extraction. The Transformer connects tokens directly, adapting to long-sequence learning without catastrophic forgetting, and enriches semantic information by attending to several representation subspaces in parallel for each word. We evaluate our method on three tasks: in-domain, cross-domain, and cross-sentence few-shot relation extraction. Our method achieves a favorable trade-off between performance and computation, with an approximately 8% improvement over the state-of-the-art prototypical network across different settings. In addition, our experiments show that our approach remains competitive for cross-domain transfer and cross-sentence relation extraction among few-shot learning methods.
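
For readers who want a concrete picture of the architecture the abstract describes, below is a minimal PyTorch sketch of a prototypical network with a Transformer encoder. It is an illustration under assumed hyperparameters, not the authors' released code; the class name TransformerProtoNet and all sizes are hypothetical. The encoder maps each sentence to a relation-level vector, class prototypes are the means of the support embeddings, and queries are scored by negative squared Euclidean distance to each prototype.

# Minimal sketch: prototypical network with a Transformer encoder (PyTorch).
# All names and hyperparameters are illustrative assumptions, not the
# paper's implementation.
import torch
import torch.nn as nn

class TransformerProtoNet(nn.Module):
    def __init__(self, vocab_size=30000, d_model=128, nhead=4,
                 num_layers=2, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def encode(self, tokens):
        # tokens: (batch, seq_len) token ids -> one vector per sentence.
        # Multi-head self-attention connects tokens directly, attending to
        # several representation subspaces in parallel for each word.
        pos = torch.arange(tokens.size(1), device=tokens.device)
        h = self.encoder(self.embed(tokens) + self.pos(pos))
        return h.mean(dim=1)  # mean-pool token states into a relation-level feature

    def forward(self, support, query):
        # support: (N_way, K_shot, seq_len) token ids; query: (Q, seq_len)
        n, k, t = support.shape
        s = self.encode(support.view(n * k, t)).view(n, k, -1)
        prototypes = s.mean(dim=1)          # one prototype per relation class
        q = self.encode(query)              # (Q, d_model)
        # Negative squared Euclidean distance to each prototype as class logits.
        return -torch.cdist(q, prototypes) ** 2

# Toy 5-way 1-shot episode with random token ids.
model = TransformerProtoNet()
support = torch.randint(0, 30000, (5, 1, 32))   # 5 classes, 1 example each
query = torch.randint(0, 30000, (3, 32))        # 3 query sentences
print(model(support, query).shape)              # torch.Size([3, 5])

In a 5-way 1-shot episode, each query sentence thus receives one logit per relation class, and training proceeds with an ordinary cross-entropy loss over episodes.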



Updated: 2021-04-08