
Sufficient Descent Riemannian Conjugate Gradient Methods

Published in: Journal of Optimization Theory and Applications

Abstract

This paper considers sufficient descent Riemannian conjugate gradient methods with line search algorithms. We propose two kinds of sufficient descent nonlinear conjugate gradient methods and prove that they satisfy the sufficient descent condition on Riemannian manifolds. One is a hybrid method combining a Fletcher–Reeves-type method with a Polak–Ribière–Polyak-type method, and the other is a Hager–Zhang-type method; both generalize methods used in Euclidean space. Moreover, we prove that the hybrid method has a global convergence property under the strong Wolfe conditions and that the Hager–Zhang-type method has the sufficient descent property regardless of whether a line search is used. Further, we review two kinds of line search algorithms on Riemannian manifolds and numerically compare our generalized methods by solving several Riemannian optimization problems. The results show that the performance of the proposed hybrid methods depends greatly on the type of line search used, whereas the Hager–Zhang-type method converges quickly regardless of the type of line search used.
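For reference, the Euclidean prototypes that these methods generalize can be stated compactly. The sketch below uses the classical formulas from the literature cited in the references; the paper's own Riemannian formulas (10) and (14) are not reproduced on this page, so it is only indicative of their exact form. With \(g_k = \nabla f(x_k)\), search direction \(d_k\), and \(y_k = g_{k+1} - g_k\), the conjugate gradient iteration and the Fletcher–Reeves and Polak–Ribière–Polyak parameters are

\[ d_{k+1} = -g_{k+1} + \beta_{k+1} d_k, \qquad \beta_{k+1}^{\mathrm{FR}} = \frac{\Vert g_{k+1}\Vert^2}{\Vert g_k\Vert^2}, \qquad \beta_{k+1}^{\mathrm{PRP}} = \frac{g_{k+1}^\top y_k}{\Vert g_k\Vert^2}. \]

Two classical FR–PRP hybrids, of Hu–Storey and Gilbert–Nocedal type, are

\[ \beta_{k+1} = \max\bigl\{0, \min\bigl\{\beta_{k+1}^{\mathrm{PRP}}, \beta_{k+1}^{\mathrm{FR}}\bigr\}\bigr\} \quad \text{and} \quad \beta_{k+1} = \max\bigl\{-\beta_{k+1}^{\mathrm{FR}}, \min\bigl\{\beta_{k+1}^{\mathrm{PRP}}, \beta_{k+1}^{\mathrm{FR}}\bigr\}\bigr\}, \]

both of which satisfy \(|\beta_{k+1}| \le \beta_{k+1}^{\mathrm{FR}}\) (cf. Note 1 below). The Hager–Zhang parameter,

\[ \beta_{k+1}^{\mathrm{HZ}} = \frac{g_{k+1}^\top y_k}{d_k^\top y_k} - 2\,\frac{\Vert y_k\Vert^2\, g_{k+1}^\top d_k}{(d_k^\top y_k)^2}, \]

yields \(g_k^\top d_k \le -\tfrac{7}{8}\Vert g_k\Vert^2\) for any line search in the Euclidean setting. The sufficient descent condition requires \(g_k^\top d_k \le -c\,\Vert g_k\Vert^2\) for some constant \(c > 0\), and the strong Wolfe conditions require, for \(0 < c_1 < c_2 < 1\),

\[ f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^\top d_k, \qquad \bigl|g(x_k + \alpha_k d_k)^\top d_k\bigr| \le c_2\, \bigl|g_k^\top d_k\bigr|. \]

On a Riemannian manifold, \(g_k\) becomes \(\operatorname{grad} f(x_k)\), the update \(x_{k+1} = R_{x_k}(\alpha_k d_k)\) uses a retraction \(R\), and \(d_k\) is carried between tangent spaces by a vector transport.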



Notes

  1. The formulas defined by (10) and (14) satisfy \(\left|\beta_{k+1}\right| \le \beta_{k+1}^{\mathrm{FR}}\).

  2. https://www.pymanopt.org/ (an illustrative usage sketch is given after these notes).

  3. https://docs.scipy.org/doc/scipy/reference/.
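Footnotes 2 and 3 point to the software used in the numerical experiments (Pymanopt and SciPy; see also reference 25). As an illustration only, not the authors' code, the following minimal sketch runs Pymanopt's built-in Riemannian conjugate gradient optimizer on a Rayleigh quotient minimization over the unit sphere. It assumes the Pymanopt 2.x API (pymanopt.Problem, pymanopt.optimizers.ConjugateGradient, and the autograd backend); the beta_rule keyword is an assumption about recent releases.

import numpy as np
import autograd.numpy as anp
import pymanopt
from pymanopt.manifolds import Sphere
from pymanopt.optimizers import ConjugateGradient

# Symmetric test matrix: the minimizer of x^T A x on the unit sphere
# is an eigenvector associated with the smallest eigenvalue of A.
n = 100
B = np.random.default_rng(0).standard_normal((n, n))
A = (B + B.T) / 2

manifold = Sphere(n)

@pymanopt.function.autograd(manifold)
def cost(x):
    # Rayleigh quotient numerator; autograd supplies the Euclidean
    # gradient, which Pymanopt converts to the Riemannian gradient.
    return anp.dot(x, anp.dot(A, x))

problem = pymanopt.Problem(manifold, cost)

# beta_rule selects the CG parameter formula; drop the argument to
# use the library default if "HagerZhang" is unavailable.
optimizer = ConjugateGradient(beta_rule="HagerZhang")
result = optimizer.run(problem)
print(result.cost)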

References

  1. Absil, P.A., Gallivan, K.A.: Joint diagonalization on the oblique manifold for independent component analysis. In: 2006 IEEE International Conference on Acoustics Speech and Signal Processing Proceedings, vol. 5 (2006)

  2. Absil, P.A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

  3. Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)

  4. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  5. Dai, Y.H., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103(1–4), 33–47 (2001)

  6. Dai, Y.H.: Nonlinear Conjugate Gradient Methods. Wiley Encyclopedia of Operations Research and Management Science (2010)

  7. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)

  8. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

  9. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  10. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  11. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  12. Hestenes, M.R., Stiefel, E.: Methods of Conjugate Gradients for Solving Linear Systems. NBS, Washington, DC (1952)

  13. Hosseini, S., Huang, W., Yousefpour, R.: Line search algorithms for locally Lipschitz functions on Riemannian manifolds. SIAM J. Optim. 28(1), 596–619 (2018)

  14. Hu, Y., Storey, C.: Global convergence result for conjugate gradient methods. J. Optim. Theory Appl. 71(2), 399–405 (1991)

  15. Narushima, Y., Yabe, H.: A survey of sufficient descent conjugate gradient methods for unconstrained optimization. SUT J. Math. 50(2), 167–203 (2014)

  16. Nocedal, J., Wright, S.: Numerical Optimization. Springer, Berlin (2006)

  17. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées [Note on the convergence of conjugate direction methods]. ESAIM Math. Model. Numer. Anal. 3(R1), 35–43 (1969)

  18. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  19. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012)

  20. Sakai, T.: Riemannian Geometry, vol. 149. American Mathematical Society, Providence (1996)

  21. Sakai, H., Iiduka, H.: Hybrid Riemannian conjugate gradient methods with global convergence properties. Comput. Optim. Appl. 77, 811–830 (2020)

  22. Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4), 1011–1031 (2015)

  23. Sato, H.: A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput. Optim. Appl. 64(1), 101–118 (2016)

  24. Smith, S.T.: Optimization techniques on Riemannian manifolds. Fields Inst. Commun. 3(3), 113–135 (1994)

  25. Townsend, J., Koep, N., Weichwald, S.: Pymanopt: a python toolbox for optimization on manifolds using automatic differentiation. J. Mach. Learn. Res. 17(1), 4755–4759 (2016)

  26. Vandereycken, B.: Low-rank matrix completion by Riemannian optimization. SIAM J. Optim. 23(2), 1214–1236 (2013)

  27. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  28. Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13(2), 185–188 (1971)

  29. Zhu, X.: A Riemannian conjugate gradient method for optimization on the Stiefel manifold. Comput. Optim. Appl. 67(1), 73–110 (2017)

Acknowledgements

We are sincerely grateful to the Editor-in-Chief, the anonymous associate editor, and the two anonymous reviewers for helping us improve the original manuscript. This work was supported by a JSPS KAKENHI Grant, Number JP18K11184.

Author information

Correspondence to Hiroyuki Sakai.

Additional information

Communicated by Sándor Zoltán Németh.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Cite this article

Sakai, H., Iiduka, H. Sufficient Descent Riemannian Conjugate Gradient Methods. J Optim Theory Appl 190, 130–150 (2021). https://doi.org/10.1007/s10957-021-01874-3
