A limited memory q-BFGS algorithm for unconstrained optimization problems
Journal of Applied Mathematics and Computing (IF 2.4), Pub Date: 2020-09-08, DOI: 10.1007/s12190-020-01432-6
Kin Keung Lai, Shashi Kant Mishra, Geetanjali Panda, Suvra Kanti Chakraborty, Mohammad Esmael Samei, Bhagwat Ram

A limited memory q-BFGS (Broyden–Fletcher–Goldfarb–Shanno) method is presented for solving unconstrained optimization problems. It is derived from a modified BFGS-type update based on the q-derivative (quantum derivative). The use of Jackson's derivative is an effective mechanism for escaping local minima. The method is complemented by the q-gradient scheme, which generates the parameter q used to compute the step length so that the search gradually shifts from a global search at the beginning to an almost purely local search at the end. Furthermore, global convergence is established under the Armijo–Wolfe conditions even when the objective function is not convex. Numerical experiments indicate that the proposed method is potentially efficient.
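The paper itself is not reproduced on this page, so the following is only a minimal Python sketch of the idea described in the abstract, not the authors' algorithm: a component-wise Jackson q-derivative used as a stand-in for the gradient inside a standard limited-memory BFGS two-loop recursion, with a backtracking Armijo line search and a simple schedule that drives q toward 1 to mimic the global-to-local transition. The function names (q_gradient, lbfgs_direction), the q-schedule, the Rosenbrock test function, and all parameter values are illustrative assumptions.

```python
import numpy as np

def q_gradient(f, x, q=0.9, eps=1e-8):
    """Component-wise Jackson (q-)derivative, used as a stand-in gradient.

    For x_i != 0:  D_q f = (f(..., q*x_i, ...) - f(x)) / ((q - 1) * x_i).
    Near x_i = 0 the q-derivative reduces to the ordinary derivative, so a
    forward difference with step eps is used instead.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xq = x.copy()
        if abs(x[i]) > eps:
            xq[i] = q * x[i]
            g[i] = (f(xq) - fx) / ((q - 1.0) * x[i])
        else:
            xq[i] = x[i] + eps
            g[i] = (f(xq) - fx) / eps
    return g

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion applied to the (q-)gradient g,
    using stored curvature pairs (s_k, y_k), oldest first."""
    r = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, r)
        r -= a * y
        alphas.append((a, rho, s, y))
    if s_list:                                  # initial Hessian scaling
        s, y = s_list[-1], y_list[-1]
        r *= np.dot(s, y) / np.dot(y, y)
    for a, rho, s, y in reversed(alphas):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r                                   # search direction

# Usage sketch on the Rosenbrock function; the schedule pushing q toward 1
# (global search early, near-local search later) is an illustrative choice.
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

x = np.array([-1.2, 1.0])
s_list, y_list, memory, q = [], [], 5, 0.9
g = q_gradient(rosenbrock, x, q)
for k in range(200):
    d = lbfgs_direction(g, s_list, y_list)
    t = 1.0
    for _ in range(30):                         # backtracking Armijo line search
        if rosenbrock(x + t * d) <= rosenbrock(x) + 1e-4 * t * np.dot(g, d):
            break
        t *= 0.5
    x_new = x + t * d
    q = 1.0 - 0.9 * (1.0 - q)                   # drive q -> 1: local search
    g_new = q_gradient(rosenbrock, x_new, q)
    s, y = x_new - x, g_new - g
    if np.dot(s, y) > 1e-10:                    # keep pairs with positive curvature
        s_list.append(s)
        y_list.append(y)
        if len(s_list) > memory:
            s_list.pop(0)
            y_list.pop(0)
    x, g = x_new, g_new

print(x)  # should move toward the minimizer [1, 1]
```

In this sketch the limited memory (here 5 pairs) keeps storage linear in the problem dimension, and discarding pairs with non-positive s·y preserves a positive-definite implicit Hessian approximation, as in standard L-BFGS.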




Updated: 2020-09-08