Riemannian conjugate gradient methods with inverse retraction
Computational Optimization and Applications (IF 1.6). Pub Date: 2020-08-17. DOI: 10.1007/s10589-020-00219-6
Xiaojing Zhu, Hiroyuki Sato

We propose a new class of Riemannian conjugate gradient (CG) methods in which inverse retraction is used instead of vector transport to construct the search direction. Existing methods typically use differentiated retraction as the vector transport that moves the previous search direction into the current tangent space. Here we adopt a different perspective, motivated by the fact that the inverse retraction directly measures the displacement from the current point to the previous one as a tangent vector at the current point. The proposed algorithm is implemented with the Fletcher–Reeves and Dai–Yuan formulae, respectively, and global convergence is established under modified Riemannian Wolfe conditions. Computational details of practical inverse retractions on the Stiefel and fixed-rank manifolds are discussed. Numerical results on the Brockett cost function minimization problem, the joint diagonalization problem, and the low-rank matrix completion problem demonstrate the potential effectiveness of Riemannian CG with inverse retraction.
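To make the construction concrete, below is a minimal sketch in Python/NumPy on the unit sphere (the simplest Stiefel manifold, St(n, 1)), minimizing the Rayleigh quotient f(x) = x^T A x, a p = 1 instance of the Brockett cost. The projection retraction and its closed-form inverse on the sphere are standard; everything else here is an illustrative assumption rather than the authors' reference implementation: the function names, the Armijo backtracking (standing in for the paper's modified Riemannian Wolfe conditions), and the exact scaling of the inverse-retraction term in the direction update.

```python
import numpy as np

def retract(x, xi):
    """Projection retraction on the sphere: R_x(xi) = (x + xi) / ||x + xi||."""
    v = x + xi
    return v / np.linalg.norm(v)

def inverse_retract(x, y):
    """Inverse projection retraction: the tangent vector xi at x with
    R_x(xi) = y, in closed form xi = y / (x^T y) - x. Requires x^T y > 0,
    which holds when x and y are nearby iterates."""
    return y / (x @ y) - x

def riemannian_grad(A, x):
    """Riemannian gradient of f(x) = x^T A x: project 2 A x onto T_x."""
    g = 2.0 * (A @ x)
    return g - (x @ g) * x

def rcg_inverse_retraction(A, x0, maxiter=500, tol=1e-8):
    x = x0 / np.linalg.norm(x0)
    g = riemannian_grad(A, x)
    eta = -g                                   # start with steepest descent
    for _ in range(maxiter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking, a simplified stand-in for the modified
        # Riemannian Wolfe conditions used in the paper.
        f_x, slope, alpha = x @ A @ x, g @ eta, 1.0
        while alpha > 1e-12:
            x_new = retract(x, alpha * eta)
            if x_new @ A @ x_new <= f_x + 1e-4 * alpha * slope:
                break
            alpha *= 0.5
        x_prev, x = x, x_new
        g_new = riemannian_grad(A, x)
        beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coefficient
        # Key step: no vector transport. Since x = R_{x_prev}(alpha * eta),
        # the old direction is recovered at the new point via
        # eta_old ~ -R_x^{-1}(x_prev) / alpha.
        eta = -g_new - (beta / alpha) * inverse_retract(x, x_prev)
        if g_new @ eta >= 0:                   # safeguard: restart if not descent
            eta = -g_new
        g = g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M + M.T                                # symmetric test matrix
    x = rcg_inverse_retraction(A, rng.standard_normal(50))
    print(x @ A @ x, np.linalg.eigvalsh(A)[0])  # both ~ smallest eigenvalue
```

The point of the sketch is the direction update: the previous search direction is never transported; a scaled copy of it is reconstructed at the new point from the single inverse-retraction evaluation R_x^{-1}(x_prev).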

Updated: 2020-08-18