Average convergence rate of evolutionary algorithms in continuous optimization
Information Sciences, Pub Date: 2021-02-09, DOI: 10.1016/j.ins.2020.12.076
Yu Chen, Jun He

The average convergence rate (ACR) measures how fast the approximation error of an evolutionary algorithm converges towards zero per generation. It is defined as the geometric average of the reduction rate of the approximation error over consecutive generations. This paper presents a theoretical analysis of the ACR in continuous optimization. The results are summarized as follows. According to its limit property, the ACR is classified into two categories: (1) linear ACR, whose limit inferior is bounded below by a positive constant, and (2) sublinear ACR, whose value converges to zero. It is then proven that the ACR is linear for evolutionary programming using positive landscape-adaptive mutation, but sublinear when using landscape-invariant or zero landscape-adaptive mutation. The relationship between the ACR and the dimension of the decision space is also classified into two categories: (1) polynomial ACR, whose value is larger than the reciprocal of a polynomial function of the dimension for any generation, and (2) exponential ACR, whose value is smaller than the reciprocal of an exponential function of the dimension over an exponentially long period. It is proven that for easy problems such as linear functions, the ACR of the (1 + 1) adaptive random univariate search is polynomial, but for hard problems such as the deceptive function, the ACR of both the (1 + 1) adaptive random univariate search and evolutionary programming is exponential.
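As a rough illustration of the quantity being analysed, the Python sketch below computes the ACR for a toy (1 + 1) strategy, assuming the form ACR_t = 1 - (e_t / e_0)^(1/t), where e_t is the approximation error at generation t; this is one minus the geometric average of the per-generation reduction ratios described above. The test function f(x) = x^2, the step-size rules, and the helper names are illustrative assumptions made here for the sketch, not the algorithms analysed in the paper.

```python
import numpy as np

def average_convergence_rate(errors):
    """ACR after t generations, assuming ACR_t = 1 - (e_t / e_0)**(1/t).

    errors[0] is the initial approximation error e_0 and errors[-1] the
    error e_t at generation t; (e_t / e_0)**(1/t) is the geometric mean
    of the per-generation reduction ratios e_1/e_0, ..., e_t/e_{t-1}.
    """
    e0, et = errors[0], errors[-1]
    t = len(errors) - 1
    return 1.0 - (et / e0) ** (1.0 / t)

rng = np.random.default_rng(0)

def run(adaptive, generations=200):
    """Toy (1+1) strategy minimising f(x) = x**2 with Gaussian mutation.

    adaptive=True scales the mutation step with |x| (a crude stand-in for
    landscape-adaptive mutation); adaptive=False uses a fixed step size
    (landscape-invariant mutation).
    """
    x = 10.0
    errors = [x * x]                      # approximation error |f(x) - f*| with f* = 0
    for _ in range(generations):
        sigma = 0.5 * abs(x) if adaptive else 0.5
        y = x + sigma * rng.standard_normal()
        if y * y < x * x:                 # elitist acceptance: keep the better point
            x = y
        errors.append(x * x)
    return errors

print("adaptive mutation  ACR:", average_convergence_rate(run(True)))
print("invariant mutation ACR:", average_convergence_rate(run(False)))
```

In this toy setting, the step size proportional to |x| typically shrinks the error by a roughly constant factor per generation, so the computed ACR stays bounded away from zero, whereas the fixed step size stalls near the optimum and the ACR decays as the run is lengthened, mirroring the linear versus sublinear distinction drawn in the abstract.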



Updated: 2021-03-01