Abstract
In this paper, we propose a new three-term conjugate gradient method whose search direction is close to that of the memoryless BFGS method. Global convergence of the method is established under a modified Wolfe line search. Numerical results confirm that the three-term method is effective and outperforms several other conjugate gradient methods when the standard Wolfe line search is used; furthermore, the method produces even better results when implemented under the modified Wolfe line search.
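The iteration structure described above can be illustrated with a generic sketch. The code below is not the authors' specific method: it uses the classical Hestenes-Stiefel beta, a third term chosen so that the direction satisfies sufficient descent, and a simple bisection-style search for the standard (weak) Wolfe conditions, all of which are assumptions for illustration only.

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style search for the standard (weak) Wolfe conditions."""
    fx = f(x)
    gd = grad(x) @ d            # directional derivative, negative for descent d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:
            hi = alpha                          # Armijo fails: step too long
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:
            lo = alpha                          # curvature fails: step too short
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Generic three-term CG: d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-12:
            beta, theta = 0.0, 0.0              # safeguard: restart with -g
        else:
            beta = (g_new @ y) / denom          # Hestenes-Stiefel parameter
            theta = (g_new @ d) / denom         # makes g_new @ d_new = -||g_new||^2
        d = -g_new + beta * d - theta * y       # three-term search direction
        x, g = x_new, g_new
    return x

x_star = three_term_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
```

With this choice of theta, the new direction satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 regardless of the line search, which is the kind of sufficient-descent property such three-term constructions are designed to guarantee.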
Acknowledgements
The authors would like to thank the Editor and the Referees for their valuable suggestions and comments which led to the improvement of this paper.
Cite this article
Diphofu, T., Kaelo, P. Another Three-Term Conjugate Gradient Method Close to the Memoryless BFGS for Large-Scale Unconstrained Optimization Problems. Mediterr. J. Math. 18, 211 (2021). https://doi.org/10.1007/s00009-021-01853-y