An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies
Information Processing & Management (IF 7.4). Pub Date: 2020-08-25. DOI: 10.1016/j.ipm.2020.102362
Jeena Kleenankandy, Abdul Nazeer K A

Background: Tree-based Long Short-Term Memory (LSTM) networks have become the state of the art for modeling the meaning of natural-language text, as they can effectively exploit grammatical syntax and thereby the non-linear dependencies among the words of a sentence. However, most of these models cannot recognize the difference in meaning caused by a change in the semantic roles of words or phrases, because they do not take into account the types of the grammatical relations, also known as typed dependencies, in the sentence structure.
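To illustrate what typed dependencies are, the sketch below lists the Universal Dependencies-style parse of a simple sentence as (head, relation, dependent) triples. The sentence and the variable name `typed_deps` are illustrative choices, not taken from the paper:

```python
# Typed dependencies for "The cat chased the mouse",
# using Universal Dependencies relation labels.
typed_deps = [
    ("chased", "nsubj", "cat"),    # "cat" is the subject of "chased"
    ("chased", "obj",   "mouse"),  # "mouse" is the object of "chased"
    ("cat",    "det",   "The"),    # determiner of "cat"
    ("mouse",  "det",   "the"),    # determiner of "mouse"
]

for head, rel, dep in typed_deps:
    print(f"{rel}({head}, {dep})")
```

Swapping the `nsubj` and `obj` roles ("The mouse chased the cat") changes the meaning even though the word set is identical, which is exactly the distinction a type-unaware Tree-LSTM cannot capture.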

Methods: This paper proposes an enhanced LSTM architecture, called relation gated LSTM, which can model the relationship between two inputs of a sequence using a control input. We also introduce a Tree-LSTM model, called Typed Dependency Tree-LSTM, that uses the dependency parse structure of a sentence as well as the dependency types to embed the sentence meaning into a dense vector.
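Based only on the description above, the idea of gating with a control input can be sketched as follows: a child node's hidden state is modulated by a gate computed from that state together with an embedding of its typed dependency, so differently related children contribute differently to the parent. This is a minimal NumPy sketch under those assumptions; the function name `relation_gate`, the dimensions, and the random parameters are all hypothetical and not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
D = 4  # toy hidden/embedding size

# Toy gate parameters, randomly initialized for illustration only.
W_r = rng.standard_normal((D, 2 * D)) * 0.1

def relation_gate(h_child, rel_emb):
    """Gate a child's hidden state by its typed-dependency embedding.

    r = sigmoid(W_r [h_child; rel_emb]); the gated state is r * h_child,
    so the relation type controls how much of the child flows upward.
    """
    x = np.concatenate([h_child, rel_emb])
    r = sigmoid(W_r @ x)        # gate values in (0, 1)
    return r * h_child

h = rng.standard_normal(D)      # hidden state of a child node
rel = rng.standard_normal(D)    # embedding of a relation, e.g. 'nsubj'
h_gated = relation_gate(h, rel)
print(h_gated.shape)
```

Because the gate values lie in (0, 1), the gated state is an elementwise attenuation of the child's hidden state; in a full Tree-LSTM the gated children would then be summed at the parent node.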

Results: The proposed model outperformed its type-unaware counterpart in two typical Natural Language Processing (NLP) tasks, Semantic Relatedness Scoring and Sentiment Analysis, and its results were comparable to or competitive with other state-of-the-art models. Qualitative analysis showed that changes in the voice of a sentence had little effect on the model's predicted scores, while changes in nominal (noun) words had a more significant impact, and that the model recognized subtle semantic relationships in sentence pairs. The magnitudes of the learned typed-dependency embeddings also agreed with human intuition.

Conclusion: The research findings highlight the significance of grammatical relations in sentence modeling. The proposed models can serve as a basis for future research in this direction.




Updated: 2020-08-26