Nesterov acceleration of alternating least squares for canonical tensor decomposition: Momentum step size selection and restart mechanisms
Numerical Linear Algebra with Applications (IF 4.3), Pub Date: 2020-04-17, DOI: 10.1002/nla.2297
Drew Mitchell, Nan Ye, Hans De Sterck

We present Nesterov-type acceleration techniques for alternating least squares (ALS) methods applied to canonical tensor decomposition. Nesterov acceleration turns gradient descent into an optimal first-order method for convex problems by adding a momentum term with a specific weight sequence, but a direct application of this method and weight sequence to ALS results in erratic convergence behavior. This is because, for our nonconvex problem, it is ALS rather than gradient descent that is being accelerated. Instead, we consider various restart mechanisms and suitable choices of momentum weights that enable effective acceleration. Our extensive empirical results show that the Nesterov-accelerated ALS methods with restart can be dramatically more efficient than stand-alone ALS or Nesterov's accelerated gradient method when problems are ill-conditioned or accurate solutions are desired. The resulting methods perform competitively with, or better than, existing acceleration methods for ALS, including acceleration by nonlinear conjugate gradients, the nonlinear generalized minimal residual method, or limited-memory Broyden-Fletcher-Goldfarb-Shanno, and have the additional benefit of being much easier to implement. We also compare with Nesterov-type updates in which the momentum weight is determined by a line search (LS); these are equivalent or closely related to existing LS methods for ALS. On a large, ill-conditioned 71×1,000×900 tensor consisting of readings from chemical sensors used to track hazardous gases, the restarted Nesterov-ALS method shows desirable robustness properties and outperforms every existing method we compare with by a large factor. There is clear potential for extending our Nesterov-type acceleration approach to optimization algorithms other than ALS and to other nonconvex problems, such as Tucker tensor decomposition.
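To make the approach concrete, the following is a minimal NumPy sketch (not the paper's implementation) of a restarted Nesterov-ALS iteration for a dense third-order tensor. It assumes one particular combination from the family of schemes the abstract describes: the classical Nesterov momentum-weight sequence together with a simple function-value restart that discards the momentum whenever the residual increases. The function and variable names (als_sweep, nesterov_als, etc.) are illustrative choices, not taken from the paper.

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move the chosen mode to the front and flatten the rest (C order).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product; the row index runs over (row of U, row of V).
    r = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

def cp_residual(T, factors):
    # Frobenius-norm residual of the rank-R CP model built from the factor matrices.
    A, B, C = factors
    return np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C))

def als_sweep(T, factors):
    # One full ALS sweep: each factor matrix in turn is the solution of a linear
    # least-squares problem with the other two factors held fixed.
    factors = [F.copy() for F in factors]
    for mode in range(3):
        others = [factors[m] for m in range(3) if m != mode]
        kr = khatri_rao(others[0], others[1])              # ordering matches the C-order unfolding
        gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
        factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(gram)
    return factors

def nesterov_als(T, rank, n_iter=200, seed=0):
    # Restarted Nesterov-ALS sketch: X holds the current iterate, Y the extrapolated
    # (momentum) point from which the next ALS sweep is taken.
    rng = np.random.default_rng(seed)
    X = [rng.standard_normal((T.shape[m], rank)) for m in range(3)]
    Y = [F.copy() for F in X]
    t = 1.0                                   # classical Nesterov weight sequence
    f_prev = cp_residual(T, X)
    history = [f_prev]
    for _ in range(n_iter):
        X_new = als_sweep(T, Y)               # ALS sweep from the accelerated point
        f_new = cp_residual(T, X_new)
        if f_new > f_prev:                    # function-value restart: drop the momentum
            X_new = als_sweep(T, X)           # plain ALS step, which cannot increase the residual
            f_new = cp_residual(T, X_new)
            t = 1.0                           # reset the weight sequence
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        beta = (t - 1.0) / t_next             # momentum weight (zero right after a restart)
        Y = [Xn + beta * (Xn - Xo) for Xn, Xo in zip(X_new, X)]
        X, t, f_prev = X_new, t_next, f_new
        history.append(f_new)
    return X, history

Typical usage would be factors, history = nesterov_als(T, rank=10). Other momentum-weight choices (for example, a weight determined by a line search) and other restart criteria of the kind compared in the paper slot into the same loop by replacing the beta update and the restart test.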

Updated: 2020-04-17