An efficient Long Short-Term Memory model based on Laplacian Eigenmap in artificial neural networks
Applied Soft Computing ( IF 7.2 ) Pub Date : 2020-03-18 , DOI: 10.1016/j.asoc.2020.106218
Fang Hu , Yanhui Zhu , Jia Liu , Liuhuan Li

A new algorithm for data prediction based on the Laplacian Eigenmap (LE) is presented. We construct a Long Short-Term Memory model that applies the LE within artificial neural networks. The new Long Short-Term Memory model based on the Laplacian Eigenmap (LE-LSTM) preserves the characteristics of the original data using the eigenvectors derived from the Laplacian matrix of the data matrix. LE-LSTM introduces a projection layer that embeds the data into a lower-dimensional space, thereby improving efficiency. With the implementation of LE, LE-LSTM achieves higher accuracy and shorter running times on various simulated data sets with multivariate, sequential, and time-series characteristics. Compared with previously reported algorithms such as stochastic gradient descent and a three-layer artificial neural network, LE-LSTM yields many more successful runs and learns much faster. The algorithm provides a computationally efficient approach for most artificial neural network data sets.
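The projection step the abstract describes can be illustrated with a standard Laplacian Eigenmap embedding. The sketch below is an assumption-laden illustration, not the authors' implementation: it uses a dense heat-kernel affinity graph (the paper's exact graph construction and the `sigma` bandwidth are not specified in the abstract) and solves the generalized eigenproblem L v = λ D v to obtain the low-dimensional coordinates that would feed the LSTM's projection layer.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmap(X, n_components=2, sigma=1.0):
    """Embed the rows of X into n_components dimensions via Laplacian Eigenmaps.

    A minimal sketch: the heat-kernel affinity graph and sigma=1.0 are
    illustrative assumptions, not the paper's stated construction.
    """
    # Pairwise squared Euclidean distances between samples
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)  # clip tiny negatives from round-off

    # Heat-kernel affinity matrix W and degree matrix D
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))

    # Graph Laplacian L = D - W; solve L v = lambda D v (eigenvalues ascending)
    L = D - W
    vals, vecs = eigh(L, D)

    # Drop the trivial constant eigenvector (eigenvalue ~ 0),
    # keep the next n_components eigenvectors as the embedding
    return vecs[:, 1:n_components + 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))            # 50 samples, 10 features
Y = laplacian_eigenmap(X, n_components=3)
print(Y.shape)                            # (50, 3): data projected to 3 dimensions
```

In the LE-LSTM pipeline described above, the embedded coordinates `Y` (rather than the raw features `X`) would serve as the lower-dimensional input to the LSTM, which is what reduces the running time.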


