
Another Three-Term Conjugate Gradient Method Close to the Memoryless BFGS for Large-Scale Unconstrained Optimization Problems

Published in: Mediterranean Journal of Mathematics

Abstract

In this paper, we propose a new three-term conjugate gradient method whose search direction is close to that of the memoryless BFGS method. The global convergence of the method is established under a modified Wolfe line search. Numerical experiments confirm that the three-term method is effective and outperforms several other conjugate gradient methods when the standard Wolfe line search is used, and that it performs even better when implemented with the modified Wolfe line search.
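The abstract outlines the general shape of such an algorithm: a conjugate gradient iteration whose search direction combines three terms and whose step length is chosen by a Wolfe-type line search. Since the paper's specific direction formula and modified Wolfe conditions are not reproduced in the abstract, the sketch below is only a generic stand-in: it uses the classical Hestenes-Stiefel-type three-term direction (which guarantees sufficient descent) and a plain Armijo backtracking line search. The function names `three_term_cg` and `armijo_backtracking` are illustrative, not from the paper.

```python
import numpy as np

def armijo_backtracking(f, fx, g_dot_d, x, d, alpha=1.0, rho=0.5, c1=1e-4):
    """Simple Armijo backtracking: a stand-in for the paper's modified
    Wolfe line search, whose exact conditions are not given in the abstract."""
    while f(x + alpha * d) > fx + c1 * alpha * g_dot_d:
        alpha *= rho
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic three-term CG skeleton. The direction update below is the
    classical Hestenes-Stiefel-type three-term form, NOT the specific
    formula proposed in this paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_backtracking(f, f(x), g @ d, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference
        dy = d @ y
        if abs(dy) < 1e-12:           # restart if the curvature term degenerates
            d = -g_new
        else:
            beta = (g_new @ y) / dy   # Hestenes-Stiefel coefficient
            theta = (g_new @ d) / dy  # third-term coefficient
            # With these coefficients, g_new @ d_new = -||g_new||^2,
            # so the new direction is always a sufficient descent direction.
            d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, for instance, the iteration converges to the unique minimizer; the restart safeguard and the built-in descent property are the features that make three-term variants attractive for large-scale problems.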



Acknowledgements

The authors would like to thank the Editor and the Referees for their valuable suggestions and comments which led to the improvement of this paper.

Author information

Corresponding author: P. Kaelo.


About this article

Cite this article

Diphofu, T., Kaelo, P. Another Three-Term Conjugate Gradient Method Close to the Memoryless BFGS for Large-Scale Unconstrained Optimization Problems. Mediterr. J. Math. 18, 211 (2021). https://doi.org/10.1007/s00009-021-01853-y

