Momentum Accelerates Evolutionary Dynamics
arXiv - CS - Information Theory. Pub Date: 2020-07-05. DOI: arXiv:2007.02449. Marc Harper and Joshua Safyan
We combine momentum from machine learning with evolutionary dynamics, where
momentum can be viewed as a simple mechanism of intergenerational memory. Using
information divergences as Lyapunov functions, we show that momentum
accelerates the convergence of evolutionary dynamics including the replicator
equation and Euclidean gradient descent on populations. When evolutionarily
stable states are present, we prove that these dynamics converge for small
learning rates or small momentum, and derive an analytic estimate of the
relative decrease in convergence time that agrees well with computations. The main
results apply even when the evolutionary dynamic is not a gradient flow. We
also show that momentum can alter the convergence properties of these dynamics,
for example by breaking the cycling associated to the rock-paper-scissors
landscape, leading to either convergence to the ordinarily non-absorbing
equilibrium, or divergence, depending on the value and mechanism of momentum.
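The combination described above can be illustrated with a minimal sketch (not the authors' implementation): discrete-time replicator dynamics on a rock-paper-scissors payoff matrix, with a heavy-ball-style momentum term acting as intergenerational memory. The payoff shift, momentum coefficient `beta`, and the simplex re-projection are illustrative choices, not details taken from the paper.

```python
import numpy as np

def replicator_step(x, A):
    """One discrete replicator update: each strategy grows in
    proportion to its fitness relative to the population mean.
    Payoffs in A are assumed shifted so all fitnesses are positive."""
    f = A @ x          # per-strategy fitness
    phi = x @ f        # mean population fitness (positive by the shift)
    return x * f / phi

def momentum_replicator(x0, A, beta=0.3, steps=100):
    """Replicator dynamics with a heavy-ball momentum term:
    the update direction retains a fraction beta of the previous
    generation's displacement (a simple intergenerational memory)."""
    x, v = x0.copy(), np.zeros_like(x0)
    traj = [x.copy()]
    for _ in range(steps):
        step = replicator_step(x, A) - x   # plain replicator displacement
        v = beta * v + step                # momentum: remember past motion
        x = x + v
        # re-project onto the simplex (momentum can overshoot its boundary)
        x = np.clip(x, 1e-12, None)
        x = x / x.sum()
        traj.append(x.copy())
    return np.array(traj)

# Rock-paper-scissors payoffs, shifted by +2 to keep fitnesses positive.
A = np.array([[2, 1, 3],
              [3, 2, 1],
              [1, 3, 2]], dtype=float)

x0 = np.array([0.5, 0.3, 0.2])
traj = momentum_replicator(x0, A, beta=0.3, steps=100)
```

Comparing the distance of `traj` from the interior equilibrium (1/3, 1/3, 1/3) for `beta = 0` versus `beta > 0` exhibits the behavior the abstract describes: plain replicator dynamics cycle around the equilibrium, while momentum can break that cycling toward convergence or divergence depending on `beta`.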
Updated: 2020-07-07