Language Adaptation for Entity Relation Classification via Adversarial Neural Networks
Journal of Computer Science and Technology (IF 1.9), Pub Date: 2021-01-30, DOI: 10.1007/s11390-020-9713-0
Bo-Wei Zou, Rong-Tao Huang, Zeng-Zhuang Xu, Yu Hong, Guo-Dong Zhou

Entity relation classification aims to classify the semantic relationship between two marked entities in a given sentence, and plays a vital role in various natural language processing applications. However, existing studies focus on exploiting mono-lingual data in English, due to the lack of labeled data in other languages. How to effectively benefit from a richly-labeled language to help a poorly-labeled language is still an open problem. In this paper, we propose a language adaptation framework for cross-lingual entity relation classification. The basic idea is to employ adversarial neural networks (AdvNN) to transfer feature representations from one language to another. In particular, such a language adaptation framework enables feature imitation via the competition between a sentence encoder and a rival language discriminator to generate effective representations. To verify the effectiveness of AdvNN, we introduce two kinds of adversarial structures, dual-channel AdvNN and single-channel AdvNN. Experimental results on the ACE 2005 multilingual training corpus show that our single-channel AdvNN achieves the best performance in both unsupervised and semi-supervised scenarios, yielding improvements of 6.61% and 2.98% over the state-of-the-art, respectively. Compared with baselines that directly adopt a machine translation module, we find that both dual-channel and single-channel AdvNN significantly improve the performance (F1) of cross-lingual entity relation classification. Moreover, extensive analysis and discussion demonstrate the appropriateness and effectiveness of different parameter settings in our language adaptation framework.
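To make the encoder-versus-discriminator idea in the abstract concrete, the following is a minimal sketch (not the authors' code) of adversarial language adaptation using a gradient-reversal formulation, as is common in adversarial domain adaptation. The network names, layer sizes, loss weighting, and toy data are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a sentence encoder is trained so that a relation
# classifier succeeds while a language discriminator fails to tell which
# language a representation came from. All sizes and data are toy assumptions.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class AdversarialRelationModel(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=64, hidden=128,
                 num_relations=7, num_languages=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True)  # sentence encoder
        self.relation_clf = nn.Linear(hidden, num_relations)      # task classifier
        self.lang_disc = nn.Linear(hidden, num_languages)         # rival discriminator

    def forward(self, tokens, lambd=1.0):
        _, h = self.encoder(self.embed(tokens))   # h: (1, batch, hidden)
        rep = h.squeeze(0)
        rel_logits = self.relation_clf(rep)
        # Gradient reversal makes the encoder work *against* the discriminator,
        # pushing it toward language-invariant representations.
        lang_logits = self.lang_disc(GradReverse.apply(rep, lambd))
        return rel_logits, lang_logits

# Toy training step on random data, just to show how the two losses combine.
model = AdversarialRelationModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

tokens = torch.randint(0, 5000, (8, 20))   # 8 sentences, 20 token ids each
rel_gold = torch.randint(0, 7, (8,))       # relation labels (richly-labeled language)
lang_gold = torch.randint(0, 2, (8,))      # 0 = source language, 1 = target language

rel_logits, lang_logits = model(tokens, lambd=0.1)
loss = ce(rel_logits, rel_gold) + ce(lang_logits, lang_gold)
loss.backward()
opt.step()
print(f"combined loss: {loss.item():.4f}")
```

In this sketch a single encoder serves both languages, which loosely corresponds to a single-channel setup; a dual-channel variant would instead keep separate encoders per language and have the discriminator compare their outputs.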



Updated: 2021-02-07