
Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases

  • Original Paper
  • Published in Numerical Algorithms

Abstract

This work investigates finite differences and the use of (diagonal) quadratic interpolation models to obtain approximations to the first and (non-mixed) second derivatives of a function. Here, it is shown that if a particular set of points is used in the interpolation model, then the solution to the associated linear system (i.e., approximations to the gradient and diagonal of the Hessian) can be obtained in \(\mathcal{O}(n)\) computations, which is the same cost as finite differences, and is a saving over the \(\mathcal{O}(n^{3})\) cost of solving a general unstructured linear system. Moreover, if the interpolation points are chosen in a particular way, then the gradient approximation is \(\mathcal{O}(h^{2})\) accurate, where h is related to the distance between the interpolation points. Numerical examples confirm the theoretical results.
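To give a feel for these costs, the sketch below (Python, illustrative only) implements the coordinate-aligned special case: 2n + 1 centred samples yield a linear system that decouples direction by direction, so the "solve" costs \(\mathcal{O}(n)\) and the gradient estimate is \(\mathcal{O}(h^{2})\) accurate. This is not the paper's aligned-regular-basis construction, which is developed in the full text; the function name and step size are assumptions made here for illustration.

    import numpy as np

    def grad_and_diag_hessian(f, x, h=1e-4):
        # Hypothetical helper (not from the paper): centred differences
        # along the coordinate directions e_1, ..., e_n. Uses 2n + 1
        # evaluations of f; each direction decouples, so the associated
        # "linear system" is diagonal and costs O(n) to solve.
        n = x.size
        f0 = f(x)
        g = np.empty(n)   # gradient estimate, O(h^2) accurate
        d = np.empty(n)   # diagonal-Hessian estimate
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            fp, fm = f(x + e), f(x - e)
            g[i] = (fp - fm) / (2.0 * h)
            d[i] = (fp - 2.0 * f0 + fm) / h**2
        return g, d

    # Example: f(x) = x0^2 + 3*x1^2 has gradient (2*x0, 6*x1)
    # and Hessian diagonal (2, 6).
    f = lambda x: x[0]**2 + 3.0 * x[1]**2
    g, d = grad_and_diag_hessian(f, np.array([1.0, -2.0]))
    print(g)  # approximately [  2. -12.]
    print(d)  # approximately [ 2.  6.]

The paper's contribution is to obtain the same \(\mathcal{O}(n)\) cost and \(\mathcal{O}(h^{2})\) gradient accuracy from a genuine quadratic interpolation model over suitably aligned interpolation points, rather than from coordinate-wise differences as above.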



Notes

  1. The work in [18] appeared after this work.

  2. Note that α and γ are functions of n, but for notational simplicity we avoid explicitly writing the dependence on n.

  3. Note that μ and ω are functions of n, but for notational simplicity we avoid explicitly writing the dependence on n.

References

  1. Audet, C., Hare, W.: Derivative-Free and Blackbox Optimization. Springer Series in Operations Research and Financial Engineering. Springer (2017)

  2. Bandeira, A.S., Scheinberg, K., Vicente, L.N.: Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization. Math. Program. Ser. B 134, 223–257 (2012)

  3. Baydin, A.G., Pearlmutter, B.A., Radul, A.A., Siskind, J.M.: Automatic differentiation in machine learning: a survey. J. Mach. Learn. Res. 18, 1–43 (2018)

  4. Berahas, A.S., Cao, L., Choromanski, K., Scheinberg, K.: A theoretical and empirical comparison of gradient approximations in derivative-free optimization. Tech. Rep., Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA, USA. arXiv:1905.01332v2 [math.OC] (2019)

  5. Cocchi, G., Liuzzi, G., Papini, A., Sciandrone, M.: An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints. Comput. Optim. Appl. 69(2), 267–296 (2018)

  6. Conn, A., Scheinberg, K., Vicente, L.: Introduction to Derivative-Free Optimization. MPS-SIAM Series on Optimization. SIAM, Philadelphia (2009)

  7. Conn, A., Scheinberg, K., Toint, P.: Recent progress in unconstrained nonlinear optimization without derivatives. Math. Program. 79, 397–414 (1997)

  8. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of interpolation sets in derivative free optimization. Math. Program. Ser. B 111, 141–172 (2008)

  9. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation. IMA J. Numer. Anal. 28, 721–748 (2008)

  10. Conn, A.R., Toint, P.L.: An algorithm using quadratic interpolation for unconstrained derivative free optimization. In: Di Pillo, G., Giannessi, F. (eds.) Nonlinear Optimization and Applications, pp. 27–47. Springer US, Boston, MA (1996)

  11. Coope, I., Price, C.: Frame-based methods for unconstrained optimization. J. Optim. Theory Appl. 107(2), 261–274 (2000)

  12. Coope, I.D., Tappenden, R.: Efficient calculation of regular simplex gradients. Comput. Optim. Appl. 72(3), 561–588 (2019). https://doi.org/10.1007/s10589-019-00063-3

  13. Fasano, G., Morales, J.L., Nocedal, J.: On the geometry phase in model-based algorithms for derivative-free optimization. Optim. Methods Softw. 24(1), 145–154 (2009)

  14. Fazel, M., Ge, R., Kakade, S., Mesbahi, M.: Global convergence of policy gradient methods for the linear quadratic regulator. In: Proceedings of the 35th International Conference on Machine Learning, PMLR 80, 1467–1476 (2018)

  15. Gilmore, P., Kelley, C.: An implicit filtering algorithm for optimization of functions with many local minima. SIAM J. Optim. 5(2), 269–285 (1995)

  16. Gilmore, P., Kelley, C.T., Miller, C.T., Williams, G.A.: Implicit filtering and optimal design problems. In: Borggaard, J., Burkardt, J., Gunzburger, M., Peterson, J. (eds.) Optimal Design and Control, pp. 159–176. Birkhäuser, Boston (1995)

  17. Hare, W., Jaberipour, M.: Adaptive interpolation strategies in derivative-free optimization: a case study. Tech. Rep., University of British Columbia, Canada, and Amirkabir University of Technology, Iran. arXiv:1511.02794v1 [math.OC] (2015)

  18. Hare, W., Jarry-Bolduc, G., Planiden, C.: Error bounds for overdetermined and underdetermined generalized centred simplex gradients. Tech. Rep., University of British Columbia, Canada, and University of Wollongong, Australia. arXiv:2006.00742v1 [math.NA] (2020)

  19. Hoffmann, P.H.W.: A hitchhiker’s guide to automatic differentiation. Numer. Algorithms 72(3), 775–811 (2016). https://doi.org/10.1007/s11075-015-0067-6

  20. Jarry-Bolduc, G., Nadeau, P., Singh, S.: Uniform simplex of an arbitrary orientation. Optim. Lett. (2019). https://doi.org/10.1007/s11590-019-01448-3

  21. Maggiar, A., Wächter, A., Dolinskaya, I.S., Staum, J.: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling. SIAM J. Optim. 28(2), 1478–1507 (2018)

  22. Margossian, C.C.: A review of automatic differentiation and its efficient implementation. Tech. Rep., Department of Statistics, Columbia University. arXiv:1811.05031v2 [cs.MS] (2019)

  23. Nelder, J., Mead, R.: A simplex method for function minimization. Comput. J. 7(4), 308–313 (1965)

  24. Nesterov, Y., Spokoiny, V.: Random gradient-free minimization of convex functions. Found. Comput. Math. 17(2), 527–566 (2017)

  25. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research. Springer (2006)

  26. Spendley, W., Hext, G., Himsworth, F.: Sequential application of simplex designs in optimisation and evolutionary operation. Technometrics 4, 441–461 (1962)

  27. Wild, S.M., Shoemaker, C.: Global convergence of radial basis function trust-region algorithms for derivative-free optimization. SIAM Rev. 55(2), 349–371 (2013)


Author information

Corresponding author

Correspondence to Rachael Tappenden.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Coope, I.D., Tappenden, R. Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases. Numer Algor 88, 767–791 (2021). https://doi.org/10.1007/s11075-020-01056-8

