Abstract
This paper considers sufficient descent Riemannian conjugate gradient methods with line search algorithms. We propose two sufficient descent nonlinear conjugate gradient methods and prove that both satisfy the sufficient descent condition on Riemannian manifolds. One is a hybrid method combining a Fletcher–Reeves-type method with a Polak–Ribière–Polyak-type method, and the other is a Hager–Zhang-type method; both generalize methods used in Euclidean space. Moreover, we prove that the hybrid method converges globally under the strong Wolfe conditions and that the Hager–Zhang-type method satisfies the sufficient descent condition regardless of whether a line search is used. Further, we review two kinds of line search algorithms on Riemannian manifolds and numerically compare the generalized methods on several Riemannian optimization problems. The results show that the performance of the proposed hybrid methods depends strongly on the type of line search used, whereas the Hager–Zhang-type method converges fast regardless of the type of line search used.
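For a concrete picture of the two update parameters described above, the following minimal NumPy sketch shows their Euclidean analogues. This is an illustration, not the paper's exact formulas: the function names are ours, the particular FR/PRP hybridization shown (clipping the PRP value into [0, β_FR], in the style of Hu–Storey and Gilbert–Nocedal) is one common choice, and the Riemannian versions in the paper additionally transport the previous gradient and search direction to the current tangent space via a vector transport.

```python
import numpy as np

def beta_hybrid(g_new, g_old):
    """Hybrid FR/PRP conjugate gradient parameter (Euclidean sketch).

    Clips the Polak-Ribiere-Polyak value into [0, beta_FR]; this is one
    common hybridization, not necessarily the paper's exact formula.
    """
    gg_old = g_old @ g_old
    beta_fr = (g_new @ g_new) / gg_old
    beta_prp = (g_new @ (g_new - g_old)) / gg_old
    return max(0.0, min(beta_prp, beta_fr))

def beta_hager_zhang(g_new, g_old, d):
    """Hager-Zhang conjugate gradient parameter (Euclidean sketch).

    In Euclidean space this choice is known to guarantee the sufficient
    descent condition g^T d <= -(7/8) * ||g||^2 for the next direction,
    independently of the line search.
    """
    y = g_new - g_old
    dy = d @ y
    return ((y - 2.0 * ((y @ y) / dy) * d) @ g_new) / dy
```

As a usage example, the sketch below minimizes a convex quadratic f(x) = ½xᵀAx − bᵀx, where the exact line search step has the closed form α = −(gᵀd)/(dᵀAd); the direction update d ← −g + βd is the standard conjugate gradient recursion.

```python
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x = np.zeros(3)
g = A @ x - b
d = -g
for _ in range(50):
    alpha = -(g @ d) / (d @ (A @ d))  # exact line search for a quadratic
    x = x + alpha * d
    g_new = A @ x - b
    if np.linalg.norm(g_new) < 1e-10:
        break
    d = -g_new + beta_hager_zhang(g_new, g, d) * d
    g = g_new
```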
References
Absil, P.A., Gallivan, K.A.: Joint diagonalization on the oblique manifold for independent component analysis. In: 2006 IEEE International Conference on Acoustics Speech and Signal Processing Proceedings, vol. 5 (2006)
Absil, P.A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)
Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dai, Y.H., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103(1–4), 33–47 (2001)
Dai, Y.H.: Nonlinear Conjugate Gradient Methods. Wiley Encyclopedia of Operations Research and Management Science (2010)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Hosseini, S., Huang, W., Yousefpour, R.: Line search algorithms for locally Lipschitz functions on Riemannian manifolds. SIAM J. Optim. 28(1), 596–619 (2018)
Hu, Y., Storey, C.: Global convergence result for conjugate gradient methods. J. Optim. Theory Appl. 71(2), 399–405 (1991)
Narushima, Y., Yabe, H.: A survey of sufficient descent conjugate gradient methods for unconstrained optimization. SUT J. Math. 50(2), 167–203 (2014)
Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer, New York (2006)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Esaim Math. Model Numer. Anal. 3(R1), 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012)
Sakai, T.: Riemannian Geometry. Translations of Mathematical Monographs, vol. 149. American Mathematical Society, Providence (1996)
Sakai, H., Iiduka, H.: Hybrid Riemannian conjugate gradient methods with global convergence properties. Comput. Optim. Appl. 77, 811–830 (2020)
Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4), 1011–1031 (2015)
Sato, H.: A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput. Optim. Appl. 64(1), 101–118 (2016)
Smith, S.T.: Optimization techniques on Riemannian manifolds. Fields Inst. Commun. 3(3), 113–135 (1994)
Townsend, J., Koep, N., Weichwald, S.: Pymanopt: a python toolbox for optimization on manifolds using automatic differentiation. J. Mach. Learn. Res. 17(1), 4755–4759 (2016)
Vandereycken, B.: Low-rank matrix completion by Riemannian optimization. SIAM J. Optim. 23(2), 1214–1236 (2013)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. Soc. Ind. Appl. Math. 11(2), 226–235 (1969)
Wolfe, P.: Convergence conditions for ascent methods, ii: some corrections. SIAM Rev. Soc. Ind. Appl. Math. 13(2), 185–188 (1971)
Zhu, X.: A Riemannian conjugate gradient method for optimization on the Stiefel manifold. Comput. Optim. Appl. 67(1), 73–110 (2017)
Acknowledgements
We are sincerely grateful to the Editor-in-Chief, the anonymous associate editor, and the two anonymous reviewers for helping us improve the original manuscript. This work was supported by a JSPS KAKENHI Grant, Number JP18K11184.
Additional information
Communicated by Sándor Zoltán Németh.
About this article
Cite this article
Sakai, H., Iiduka, H. Sufficient Descent Riemannian Conjugate Gradient Methods. J Optim Theory Appl 190, 130–150 (2021). https://doi.org/10.1007/s10957-021-01874-3
Keywords
- Riemannian conjugate gradient method
- Sufficient descent condition
- Strong Wolfe conditions
- Line search algorithm