Enhancing Chinese Character Representation With Lattice-Aligned Attention
IEEE Transactions on Neural Networks and Learning Systems (IF 10.4). Pub Date: 2021-10-05. DOI: 10.1109/tnnls.2021.3114378
Shan Zhao, Minghao Hu, Zhiping Cai, Zhanjun Zhang, Tongqing Zhou, Fang Liu

Word–character lattice models, in which word boundary information is fused into character sequences, have proven effective for several Chinese natural language processing (NLP) tasks. However, because of their inherently unidirectional, sequential nature, prior approaches learn only sequential interactions between character and word instances and fail to capture fine-grained correlations across word–character spaces. In this article, we propose a lattice-aligned attention network (LAN) that models dense interactions over the word–character lattice structure to enhance character representations. By carefully combining a cross-lattice module, a gated word–character semantic fusion unit, and a self-lattice attention module, the network explicitly captures fine-grained correlations across different spaces (e.g., word-to-character and character-to-character), significantly improving model performance. Experimental results on three Chinese NLP benchmark tasks demonstrate that LAN achieves state-of-the-art results against several competitive approaches.
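For intuition, the sketch below illustrates the general idea in PyTorch: a cross-lattice attention step in which character representations attend over candidate lattice words, followed by a gated word–character semantic fusion. This is a minimal illustration of the technique as described in the abstract, not the authors' published implementation; the class name, gate parameterization, and mask convention are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossLatticeAttention(nn.Module):
    """Hypothetical sketch: characters attend over matched lattice words,
    then a gate fuses character and word semantics (names are illustrative)."""

    def __init__(self, d_model):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)  # queries from characters
        self.k = nn.Linear(d_model, d_model)  # keys from lattice words
        self.v = nn.Linear(d_model, d_model)  # values from lattice words
        self.gate = nn.Linear(2 * d_model, d_model)  # gated fusion unit

    def forward(self, chars, words, lattice_mask=None):
        # chars: (batch, n_chars, d_model); words: (batch, n_words, d_model)
        q, k, v = self.q(chars), self.k(words), self.v(words)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        if lattice_mask is not None:
            # lattice_mask: (batch, n_chars, n_words) bool, True where a word
            # covers that character; assumes each character matches >= 1 word
            scores = scores.masked_fill(~lattice_mask, float("-inf"))
        word_ctx = F.softmax(scores, dim=-1) @ v  # word-to-character context
        g = torch.sigmoid(self.gate(torch.cat([chars, word_ctx], dim=-1)))
        return g * chars + (1 - g) * word_ctx  # gated word-character fusion
```

The fused character states could then be fed to an ordinary self-attention layer over the character sequence (a plausible reading of the self-lattice attention module) to capture character-to-character correlations, e.g. `fused = CrossLatticeAttention(128)(char_emb, word_emb, mask)`.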