Hierarchical-Task Reservoir for Online Semantic Analysis From Continuous Speech
IEEE Transactions on Neural Networks and Learning Systems (IF 10.4). Pub Date: 2021-09-27. DOI: 10.1109/tnnls.2021.3095140
Luca Pedrelli, Xavier Hinaut

In this article, we propose a novel architecture called the hierarchical-task reservoir (HTR), suitable for real-time applications in which different levels of abstraction are available. We apply it to semantic role labeling (SRL) based on continuous speech recognition. Taking inspiration from the brain, which exhibits hierarchies of representations from perceptive to integrative areas, we consider a hierarchy of four subtasks with increasing levels of abstraction (phone, word, part-of-speech (POS), and semantic role tags). These tasks are progressively learned by the layers of the HTR architecture. Interestingly, quantitative and qualitative results show that the hierarchical-task approach improves prediction. In particular, the qualitative results show that neither a shallow nor a hierarchical reservoir, considered as baselines, produces estimates as good as those of the HTR model. Moreover, we show that the accuracy of the model can be further improved by adding skip connections and by including word embeddings (WE) in the internal representations. Overall, the HTR outperformed the other state-of-the-art reservoir-based approaches and proved extremely efficient compared with typical recurrent neural networks (RNNs) used in deep learning (DL), e.g., long short-term memory (LSTM) networks. The HTR architecture is proposed as a step toward modeling the online and hierarchical processes at work in the brain during language comprehension.
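The layered, progressive-learning scheme described above can be sketched with standard echo-state-network machinery: each layer is a fixed random reservoir, only its linear readout is trained (here by ridge regression), and the next layer receives the raw input together with the previous layer's task predictions. This is an illustrative sketch under stated assumptions, not the authors' implementation; the layer sizes, leak rate, and the two toy "tasks" of increasing abstraction are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Fixed random weights for one echo-state reservoir layer."""
    W_in = rng.uniform(-input_scale, input_scale, (n_res, n_in))
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # scale spectral radius
    return W_in, W

def run_reservoir(W_in, W, U, leak=0.3):
    """Collect leaky-integrator states for an input sequence U of shape (T, n_in)."""
    x = np.zeros(W.shape[0])
    states = np.empty((U.shape[0], W.shape[0]))
    for t, u in enumerate(U):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

def ridge_readout(X, Y, alpha=1e-6):
    """Linear readout trained by ridge regression (the only trained part)."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

# Toy data: an input sequence and two targets of increasing "abstraction"
# (standing in for, e.g., phone labels then word labels).
T, n_in = 200, 5
U = rng.standard_normal((T, n_in))
Y1 = np.roll(U[:, :2], 1, axis=0)       # lower-level subtask
Y2 = Y1.sum(axis=1, keepdims=True)      # higher-level subtask built on Y1

# Layer 1: trained on the low-level subtask.
W_in1, W1 = make_reservoir(n_in, 100)
X1 = run_reservoir(W_in1, W1, U)
Y1_hat = X1 @ ridge_readout(X1, Y1)

# Layer 2: fed the raw input plus layer 1's predictions (the
# hierarchical-task idea), trained on the higher-level subtask.
U2 = np.hstack([U, Y1_hat])
W_in2, W2 = make_reservoir(U2.shape[1], 100)
X2 = run_reservoir(W_in2, W2, U2)
Y2_hat = X2 @ ridge_readout(X2, Y2)

print(Y1_hat.shape, Y2_hat.shape)  # (200, 2) (200, 1)
```

A skip connection, as mentioned in the abstract, would simply concatenate earlier layers' states or predictions into `U2` as well; the principle is unchanged.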

Updated: 2021-09-27