On the Asymptotic Linear Convergence Speed of Anderson Acceleration, Nesterov Acceleration, and Nonlinear GMRES
arXiv - CS - Numerical Analysis. Pub Date: 2020-07-04. DOI: arxiv-2007.01996. Hans De Sterck and Yunhui He
We consider nonlinear convergence acceleration methods for fixed-point
iteration $x_{k+1}=q(x_k)$, including Anderson acceleration (AA), nonlinear
GMRES (NGMRES), and Nesterov-type acceleration (corresponding to AA with window
size one). We focus on fixed-point methods that converge asymptotically
linearly with convergence factor $\rho<1$ and that solve an underlying fully
smooth and non-convex optimization problem. It is often observed that AA and
NGMRES substantially improve the asymptotic convergence behavior of the
fixed-point iteration, but this improvement has not been quantified
theoretically. We investigate this problem under simplified conditions. First,
we consider stationary versions of AA and NGMRES, and determine coefficients
that result in optimal asymptotic convergence factors, given knowledge of the
spectrum of $q'(x)$ at the fixed point $x^*$. This allows us to understand and
quantify the asymptotic convergence improvement that can be provided by
nonlinear convergence acceleration, viewing $x_{k+1}=q(x_k)$ as a nonlinear
preconditioner for AA and NGMRES. Second, for the case of infinite window size,
we consider linear asymptotic convergence bounds for GMRES applied to the
fixed-point iteration linearized about $x^*$. Since AA and NGMRES are
equivalent to GMRES in the linear case, one may expect the GMRES convergence
factors to be relevant for AA and NGMRES as $x_k \rightarrow x^*$. Our results
are illustrated numerically for a class of test problems from canonical tensor
decomposition, comparing steepest descent and alternating least squares (ALS)
as the fixed-point iterations that are accelerated by AA and NGMRES. Our
numerical tests show that both approaches allow us to estimate asymptotic
convergence speed for nonstationary AA and NGMRES with finite window size.
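The windowed Anderson acceleration described above can be sketched in a few lines of NumPy. This is a minimal illustration of a standard AA variant (window size $m$, mixing parameter one), not the paper's exact algorithm; the linear test map $q(x)=Ax+b$ with spectral radius $0.9$ is an illustrative stand-in for the tensor-decomposition problems in the paper.

```python
import numpy as np

def anderson_acceleration(q, x0, m=2, max_iter=50, tol=1e-10):
    """Anderson acceleration (window size m) for x_{k+1} = q(x_k).

    Minimal sketch of a standard AA variant; not the paper's exact
    algorithmic setup.
    """
    x = x0.copy()
    X_hist, F_hist = [], []              # past iterates and residuals
    for k in range(max_iter):
        fx = q(x)
        f = fx - x                       # residual f_k = q(x_k) - x_k
        if np.linalg.norm(f) < tol:
            break
        X_hist.append(x.copy())
        F_hist.append(f.copy())
        mk = min(m, len(X_hist) - 1)     # effective window size
        if mk == 0:
            x = fx                       # plain fixed-point step
        else:
            # Columns: differences of the last mk residuals/iterates.
            dF = np.column_stack([F_hist[-mk + i] - F_hist[-mk + i - 1]
                                  for i in range(mk)])
            dX = np.column_stack([X_hist[-mk + i] - X_hist[-mk + i - 1]
                                  for i in range(mk)])
            # Least-squares step: minimize ||f - dF @ gamma||.
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = x + f - (dX + dF) @ gamma
    return x, k

# Linear test map q(x) = A x + b with spectral radius rho = 0.9
# (eigenvalues 0.9 and 0.5); values are illustrative only.
A = np.array([[0.9, 0.0],
              [0.3, 0.5]])
b = np.array([1.0, 1.0])
q = lambda x: A @ x + b
x_star = np.linalg.solve(np.eye(2) - A, b)   # exact fixed point
x_aa, iters = anderson_acceleration(q, np.zeros(2), m=2)
```

On this linear map, AA is equivalent to GMRES (as the abstract notes) and converges in a handful of steps, whereas the plain iteration with $\rho = 0.9$ would need roughly 200 iterations to reach a residual of $10^{-10}$.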
Updated: 2020-11-10