New optimization algorithms for neural network training using operator splitting techniques.
Neural Networks (IF 6.0), Pub Date: 2020-03-26, DOI: 10.1016/j.neunet.2020.03.018
Cristian Daniel Alecsa, Titus Pinţa, Imre Boros

In this paper we present a new type of optimization algorithm adapted for neural network training. These algorithms are based on a sequential operator splitting technique applied to associated dynamical systems. Furthermore, through numerical simulations we investigate the empirical rate of convergence of these iterative schemes toward a local minimum of the loss function, for suitable choices of the underlying hyper-parameters. We validate the convergence of these optimizers using accuracy and loss results on the MNIST, Fashion-MNIST and CIFAR-10 classification datasets.
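The core idea behind such methods can be illustrated with a toy example. A sketch of sequential (Lie–Trotter) operator splitting applied to the classical heavy-ball dynamical system x' = v, v' = -γ·v - ∇f(x) is shown below; the velocity sub-flow and position sub-flow are integrated one after the other. This is a minimal illustration of the general technique under assumed hyper-parameter names (`lr`, `gamma`), not the authors' actual schemes from the paper.

```python
import numpy as np

def lie_trotter_step(x, v, grad_f, lr=0.01, gamma=0.9):
    """One sequential-splitting step for the heavy-ball system
    x' = v,  v' = -gamma*v - grad f(x).
    Operator A advances the velocity; operator B advances the position.
    (Illustrative sketch only, not the paper's exact discretization.)"""
    # Operator A: velocity sub-flow (damping plus gradient forcing)
    v = gamma * v - lr * grad_f(x)
    # Operator B: position sub-flow driven by the updated velocity
    x = x + v
    return x, v

# Toy usage: minimize f(x) = ||x||^2 / 2, whose gradient is x itself
grad_f = lambda x: x
x = np.array([3.0, -2.0])
v = np.zeros_like(x)
for _ in range(200):
    x, v = lie_trotter_step(x, v, grad_f)
# x is now close to the minimizer at the origin
```

In a neural-network setting, `grad_f` would be replaced by a stochastic mini-batch gradient of the loss, with the same two-stage update applied per iteration.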

Updated: 2020-03-27