Generalization of the gradient method with fractional order gradient direction
Journal of the Franklin Institute ( IF 4.1 ) Pub Date : 2020-01-26 , DOI: 10.1016/j.jfranklin.2020.01.008
Yiheng Wei , Yu Kang , Weidi Yin , Yong Wang

Fractional calculus is an efficient tool with the potential to improve the performance of gradient methods. However, when the first order gradient direction is replaced by a fractional order one, the resulting algorithm converges to a fractional extreme point of the target function, which in general differs from the real extreme point. This drawback critically hampers the application of the method. To solve this convergence problem, the current paper analyzes its specific causes and proposes three possible solutions. Considering the long memory characteristics of the fractional derivative, the short memory principle is a natural first choice. Apart from truncating the memory length, two new methods are developed to achieve convergence: the first truncates the infinite series, and the second modifies the constant fractional order. Finally, six illustrative examples are presented to demonstrate the effectiveness and practicability of the proposed methods.
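To make the short-memory idea concrete, the following is a minimal sketch of a fractional-order gradient iteration in which the lower terminal of the fractional derivative is moved to a recent iterate, so the scheme can approach the true extreme point. It uses only the first term of the fractional Taylor expansion of the Caputo derivative; the function `frac_grad_descent` and all parameter values are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def frac_grad_descent(grad, x0, alpha=0.9, lr=0.1, K=1, iters=100):
    """Illustrative fractional-order gradient descent with short-memory
    truncation (an assumed simplification, not the paper's full scheme).

    The alpha-order gradient is approximated by the first term of the
    fractional Taylor series of the Caputo derivative with lower
    terminal c = x_{k-K}:
        D^alpha f(x) ~= f'(x) * |x - c|^(1 - alpha) / Gamma(2 - alpha)
    Moving c along with the iterates (rather than fixing it at x0) is
    what lets the iteration converge to the real extreme point.
    """
    history = [x0]
    x = x0
    for k in range(iters):
        c = history[max(0, len(history) - 1 - K)]  # truncated lower terminal
        # one-term Caputo approximation of the alpha-order gradient;
        # small epsilon avoids 0^(1-alpha) at the first step
        d = grad(x) * (abs(x - c) + 1e-12) ** (1 - alpha) / math.gamma(2 - alpha)
        x = x - lr * d
        history.append(x)
    return x

# Usage: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3);
# with a fixed lower terminal the iteration would stop short of x = 3,
# but with the moving terminal it approaches the true minimizer.
x_star = frac_grad_descent(lambda x: 2 * (x - 3), x0=0.0, alpha=0.9)
```

Because the memory window slides with the iterates, the factor |x - c|^(1 - alpha) shrinks near the solution and the fractional direction aligns with the ordinary gradient direction, which is the intuition behind the truncation-based fixes described above.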




Updated: 2020-03-20