On a multilevel Levenberg–Marquardt method for the training of artificial neural networks and its application to the solution of partial differential equations
Optimization Methods & Software ( IF 2.2 ) Pub Date : 2020-06-22 , DOI: 10.1080/10556788.2020.1775828
H. Calandra 1 , S. Gratton 2 , E. Riccietti 2 , X. Vasseur 3

In this paper, we propose a new multilevel Levenberg–Marquardt optimizer for the training of artificial neural networks with a quadratic loss function. This setting allows us to gain further insight into the potential of multilevel optimization methods. Indeed, when the least squares problem arises from the training of artificial neural networks, the variables subject to optimization are not related by any geometrical constraints, so the standard interpolation and restriction operators can no longer be employed. A heuristic, inspired by algebraic multigrid methods, is then proposed to construct the multilevel transfer operators. We test the new optimizer on an important application: the approximate solution of partial differential equations by means of artificial neural networks. The learning problem is formulated as a least squares problem, with the nonlinear residual of the equation chosen as the loss function, and the multilevel method employed as the training method. Numerical experiments show encouraging results for the efficiency of the new multilevel optimization method compared to the corresponding one-level procedure in this context.
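To make the setting concrete, the following is a minimal sketch of the one-level Levenberg–Marquardt update that such a method builds on: at each iteration the damped normal equations (JᵀJ + λI)d = −Jᵀr are solved for the step d. This is illustrated on a hypothetical toy nonlinear least-squares problem (fitting an exponential), not on the paper's neural-network training problem, and the multilevel transfer operators and AMG-inspired heuristic are not reproduced here.

```python
import numpy as np

def lm_step(residual, jacobian, x, lam):
    """One Levenberg-Marquardt step for min 0.5 * ||r(x)||^2.

    Solves the damped normal equations (J^T J + lam * I) d = -J^T r
    and returns the updated iterate x + d.
    """
    r = residual(x)
    J = jacobian(x)
    A = J.T @ J + lam * np.eye(x.size)  # damping regularizes the Gauss-Newton system
    d = np.linalg.solve(A, -J.T @ r)
    return x + d

# Toy problem (illustrative only): fit y = exp(a * t) at sample points,
# with unknown parameter a and true value a = 0.7.
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.7 * t)

residual = lambda x: np.exp(x[0] * t) - y                     # r(x) in R^20
jacobian = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)    # dr/da, shape (20, 1)

x = np.array([0.0])
for _ in range(20):
    x = lm_step(residual, jacobian, x, lam=1e-3)
# x[0] converges toward the true parameter 0.7
```

In the paper's setting, r(x) would instead stack the nonlinear PDE residual of the network ansatz at collocation points, and the multilevel method would alternate such steps between fine and coarse representations of the network parameters.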


