Derivative-free global minimization for a class of multiple minima problems
arXiv - CS - Computational Complexity. Pub Date: 2020-06-15, DOI: arXiv-2006.08181
Xiaopeng Luo, Xin Xu, and Daoyi Dong

We prove that finite-difference based derivative-free descent (FD-DFD) methods are capable of finding the global minima for a class of multiple minima problems. Our main result shows that, for a class of multiple minima objectives extended from strongly convex functions with Lipschitz-continuous gradients, the iterates of FD-DFD converge to the global minimizer $x_*$ at the linear rate $\|x_{k+1}-x_*\|_2^2\leqslant\rho^k \|x_1-x_*\|_2^2$ for a fixed $0<\rho<1$ and any initial iterate $x_1\in\mathbb{R}^d$ when the parameters are properly selected. Since the per-iteration cost, i.e., the number of function evaluations, is fixed and almost independent of the dimension $d$, the FD-DFD algorithm has a complexity bound of $\mathcal{O}(\log\frac{1}{\epsilon})$ for finding a point $x$ such that the optimality gap $\|x-x_*\|_2^2$ is less than $\epsilon>0$. Numerical experiments in dimensions ranging from $5$ to $500$ demonstrate the benefits of the FD-DFD method.
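To illustrate the kind of iteration the abstract describes, below is a minimal sketch of a finite-difference derivative-free descent step. It uses a single random search direction with a central finite-difference estimate per iteration (two function evaluations, so the per-iteration cost is nearly independent of $d$); the paper's actual FD-DFD sampling scheme and parameter selection rules are not given in the abstract, so the direction choice, step size, and difference parameter here are illustrative assumptions.

```python
import math
import random

def fd_dfd_sketch(f, x0, step=0.5, h=1e-3, iters=200, seed=0):
    """Illustrative finite-difference derivative-free descent loop.

    Each iteration draws one random unit direction u, estimates the
    directional derivative of f along u by a central finite difference
    (2 function evaluations), and steps against it. This is a sketch of
    the general FD-DFD idea, not the paper's exact algorithm.
    """
    rng = random.Random(seed)
    x = list(x0)
    d = len(x)
    for _ in range(iters):
        # Draw a random unit direction u (Gaussian, then normalize).
        u = [rng.gauss(0.0, 1.0) for _ in range(d)]
        norm = math.sqrt(sum(ui * ui for ui in u)) or 1.0
        u = [ui / norm for ui in u]
        # Central finite-difference estimate of the derivative along u.
        xp = [xi + h * ui for xi, ui in zip(x, u)]
        xm = [xi - h * ui for xi, ui in zip(x, u)]
        g = (f(xp) - f(xm)) / (2.0 * h)
        # Descend along u with a fixed step size.
        x = [xi - step * g * ui for xi, ui in zip(x, u)]
    return x

# Usage: minimize a simple strongly convex quadratic in d = 5.
f = lambda x: sum(xi * xi for xi in x)
x = fd_dfd_sketch(f, [3.0] * 5)
```

On the quadratic above, each step removes the component of $x$ along the sampled direction, so the squared distance to the minimizer contracts in expectation by a factor of roughly $1-1/d$ per iteration, mirroring the linear-rate behavior the abstract claims for the properly parameterized method.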

Updated: 2020-06-26