Can Transfer Neuroevolution Tractably Solve Your Differential Equations?
arXiv - CS - Neural and Evolutionary Computing Pub Date : 2021-01-06 , DOI: arxiv-2101.01998
Jian Cheng Wong, Abhishek Gupta, Yew-Soon Ong

This paper introduces neuroevolution for solving differential equations. The solution is obtained by optimizing a deep neural network whose loss function is defined by the residual terms of the differential equations. Recent studies have focused on learning such physics-informed neural networks through stochastic gradient descent (SGD) variants, yet these face difficulty obtaining an accurate solution due to optimization challenges. In the context of solving differential equations, we are faced with the problem of finding globally optimal parameters of the network, rather than being concerned with out-of-sample generalization. SGD, which searches along a single gradient direction, is prone to becoming trapped in local optima, so it may not be the best approach here. In contrast, neuroevolution carries out a parallel exploration of diverse solutions with the goal of circumventing local optima, and could potentially find more accurate solutions with better-optimized neural networks. However, neuroevolution can be slow, raising tractability issues in practice. With that in mind, a novel and computationally efficient transfer neuroevolution algorithm is proposed in this paper. Our method is capable of exploiting relevant experiential priors when solving a new problem, with adaptation to protect against the risk of negative transfer. The algorithm is applied to a variety of differential equations to empirically demonstrate that transfer neuroevolution can indeed achieve better accuracy and faster convergence than SGD. The experimental outcomes thus establish transfer neuroevolution as a noteworthy approach for solving differential equations, one that has not previously been studied. Our work expands the set of available algorithms for optimizing physics-informed neural networks.
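The core idea (a physics-informed residual loss minimized by population-based search rather than SGD) can be illustrated with a minimal sketch. This is not the paper's transfer neuroevolution algorithm; it is a toy substitute using a simple elite-recombination evolution strategy on the ODE u'(x) = -u(x), u(0) = 1 (exact solution exp(-x)). All function names, network sizes, and hyperparameters below are illustrative assumptions, and the derivative is approximated by finite differences to keep the example dependency-free.

```python
import numpy as np

# Toy problem: u'(x) = -u(x), u(0) = 1 on [0, 1]; exact solution exp(-x).
# Physics-informed loss = mean squared ODE residual + boundary penalty,
# minimized by a simple (mu, lambda) evolution strategy instead of SGD.

rng = np.random.default_rng(0)
H = 16                       # hidden units of a 1-H-1 tanh network
DIM = H + H + H + 1          # parameter count: W1 (1xH), b1 (H), W2 (Hx1), b2

def unpack(theta):
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H]
    return W1, b1, W2, b2

def net(theta, x):
    """Evaluate the 1-H-1 tanh network at points x (1-D array)."""
    W1, b1, W2, b2 = unpack(theta)
    return (np.tanh(x[:, None] @ W1 + b1) @ W2).ravel() + b2

def pinn_loss(theta, x, eps=1e-4):
    """Residual of u' + u = 0 (u' by central differences) plus BC penalty."""
    u = net(theta, x)
    du = (net(theta, x + eps) - net(theta, x - eps)) / (2 * eps)
    residual = du + u
    bc = net(theta, np.array([0.0]))[0] - 1.0   # enforce u(0) = 1
    return np.mean(residual ** 2) + bc ** 2

def evolve(n_gen=300, pop=40, elite=8, sigma=0.3):
    """Elite-recombination ES: sample a population around the current mean,
    keep the best candidates, recombine, and anneal the mutation step."""
    x = np.linspace(0.0, 1.0, 32)
    mean = rng.normal(0.0, 0.1, DIM)
    for _ in range(n_gen):
        cand = mean + sigma * rng.normal(size=(pop, DIM))
        losses = np.array([pinn_loss(c, x) for c in cand])
        mean = cand[np.argsort(losses)[:elite]].mean(axis=0)
        sigma *= 0.99                            # slowly shrink exploration
    return mean, pinn_loss(mean, x)
```

Because each generation evaluates a whole population of parameter vectors independently, this search explores many basins in parallel, which is the property the abstract contrasts with SGD's single gradient trajectory. The transfer variant in the paper additionally seeds this search with priors from previously solved equations.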

Updated: 2021-01-07