Abstract
In this paper, we study the complexity of the forward–backward splitting method with Beck–Teboulle's line search for solving convex optimization problems whose objective function splits into the sum of a differentiable function and a nonsmooth function. We show that the method converges weakly to an optimal solution in Hilbert spaces under mild standing assumptions, without requiring global Lipschitz continuity of the gradient of the differentiable function involved. Our standing assumptions are weaker than the corresponding conditions in the paper of Salzo (SIAM J Optim 27:2153–2181, 2017). The conventional sublinear convergence rate for the function value is also obtained under local Lipschitz continuity of the gradient of the differentiable function. Our main results concern the linear convergence of this method (in the quotient sense), in terms of both the function value sequence and the iterate sequence, under only a quadratic growth condition. Our proof technique proceeds directly from the quadratic growth condition and some properties of the forward–backward splitting method, without using error bounds or the Kurdyka–Łojasiewicz inequality as in other publications in this direction.
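To fix ideas, the iteration studied here can be sketched for a concrete instance. The following is a minimal, illustrative implementation of proximal forward–backward splitting with a backtracking line search in the spirit of Beck–Teboulle, applied to the lasso problem min 0.5‖Ax − b‖² + λ‖x‖₁; all function names, parameter defaults, and the stopping rule are our own choices for exposition, not the exact scheme analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fb_splitting_backtracking(A, b, lam, x0, alpha0=1.0, beta=0.5,
                              max_iter=500, tol=1e-10):
    """Forward-backward iterations with backtracking: shrink the step
    alpha until the quadratic upper-bound (descent) test holds, then
    take the proximal-gradient step.  Illustrative sketch only."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)   # smooth part
    grad = lambda x: A.T @ (A @ x - b)             # its gradient
    x = x0.astype(float)
    alpha = alpha0
    for _ in range(max_iter):
        g = grad(x)
        fx = f(x)
        while True:
            # Forward (gradient) step followed by backward (proximal) step.
            x_new = soft_threshold(x - alpha * g, alpha * lam)
            d = x_new - x
            # Beck-Teboulle-type sufficient-decrease test.
            if f(x_new) <= fx + g @ d + (0.5 / alpha) * (d @ d):
                break
            alpha *= beta  # backtrack: shrink the step size
        if np.linalg.norm(d) <= tol:
            return x_new
        x = x_new
    return x
```

With A the identity, the method reduces to one soft-thresholding of b, which makes the fixed point easy to verify by hand.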
Notes
We are indebted to some remarks from one referee that allowed us to extend the results from finite dimensions to Hilbert spaces in the current version.
We are grateful to one of the referees, whose remarks led us to this simple proof.
This observation comes from one of the referees.
References
Daubechies, I., Defrise, M., De Mol, C.: An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Commun. Pure Appl. Math. 57, 1413–1457 (2004)
Hale, E.T., Yin, W., Zhang, Y.: Fixed-point continuation for \(\ell _1\)-minimization: methodology and convergence. SIAM J. Optim. 19, 1107–1130 (2008)
Tibshirani, R.: Regression shrinkage and selection via the Lasso. J. R. Stat. Soc. 58, 267–288 (1996)
Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York (2011)
Bredies, K., Lorenz, D.A.: Linear convergence of iterative soft-thresholding. J. Fourier Anal. Appl. 14, 813–837 (2008)
Beck, A., Teboulle, M.: Gradient-based algorithms with applications to signal recovery problems. In: Palomar, D., Eldar, Y. (eds.) Convex Optimization in Signal Processing and Communications, pp. 42–88. Cambridge University Press, Cambridge (2010)
Combettes, P.L., Pesquet, J.-C.: Proximal splitting methods in signal processing. In: Fixed-Point Algorithms for Inverse Problems in Science and Engineering. Springer Optimization and Its Applications, vol. 49, pp. 185–212. Springer, New York (2011)
Combettes, P.L., Wajs, V.R.: Signal recovery by proximal forward–backward splitting. Multiscale Model. Simul. 4, 1168–1200 (2005)
Parikh, N., Boyd, S.: Proximal algorithms. Found. Trends Optim. 1, 127–239 (2014)
Davis, D., Yin, W.: Convergence rate analysis of several splitting schemes. In: Splitting Methods in Communication, Imaging, Science, and Engineering. Scientific Computation, pp. 115–163. Springer, Cham (2016)
Tseng, P.: A modified forward–backward splitting method for maximal monotone mappings. SIAM J. Control Optim. 38, 431–446 (2000)
Bello Cruz, J.Y., Nghia, T.T.A.: On the convergence of the proximal forward–backward splitting method with linesearches. Optim. Methods Softw. 31, 1209–1238 (2016)
Salzo, S.: The variable metric forward–backward splitting algorithm under mild differentiability assumptions. SIAM J. Optim. 27, 2153–2181 (2017)
Bauschke, H.H., Bolte, J., Teboulle, M.: A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications. Math. Oper. Res. 42, 330–348 (2017)
Csiszár, I.: Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. Ann. Statist. 19, 2032–2066 (1991)
Vardi, Y., Shepp, L.A., Kaufman, L.: A statistical model for positron emission tomography. J. Amer. Statist. Assoc. 80, 8–37 (1985)
Bello-Cruz, J.Y., Li, G., Nghia, T.T.A.: On the linear convergence of forward-backward splitting method. Quadratic growth condition and uniqueness of optimal solution to Lasso, preprint, Part II (2020)
Bolte, J., Nguyen, T.P., Peypouquet, J., Suter, B.W.: From error bounds to the complexity of first-order descent methods for convex functions. Math. Program. 165, 471–507 (2017)
Garrigos, G., Rosasco, L., Villa, S.: Convergence of the forward-backward algorithm: beyond the worst case with the help of geometry, arXiv:1703.09477 (2017)
Garrigos, G., Rosasco, L., Villa, S.: Thresholding gradient methods in Hilbert spaces: support identification and linear convergence, arXiv:1712.00357 (2017)
Li, G., Pong, T.K.: Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods. Found. Comp. Math. 18, 1199–1232 (2018)
Li, G., Mordukhovich, B.S., Nghia, T.T.A., Pham, T.S.: Error bounds for parametric polynomial systems with applications to higher-order stability analysis and convergence rates. Math. Program. 168, 313–346 (2018)
Zhou, Z., So, A.M.-C.: A unified approach to error bounds for structured convex optimization. Math. Program. 165, 689–728 (2017)
Li, G.: Global error bounds for piecewise convex polynomials. Math. Program. 137(1–2), 37–64 (2013)
Drusvyatskiy, D., Lewis, A.: Error bounds, quadratic growth, and linear convergence of proximal methods. Math. Oper. Res. 43, 693–1050 (2018)
Necoara, I., Nesterov, Yu., Glineur, F.: Linear convergence of first order methods for non-strongly convex optimization. Math. Program. 175, 69–107 (2019)
Tao, S., Boley, D., Zhang, S.: Local linear convergence of ISTA and FISTA on the Lasso problem. SIAM J. Optim. 26, 313–336 (2016)
Luo, Z.-Q., Tseng, P.: Error bounds and convergence analysis of feasible descent methods: a general approach. Ann. Oper. Res. 46, 157–178 (1993)
Azé, D., Corvellec, J.-N.: Nonlinear local error bounds via a change of metric. J. Fixed Point Theory Appl. 16, 251–372 (2014)
Aragón Artacho, F.J., Geoffroy, M.H.: Characterizations of metric regularity of subdifferentials. J. Convex Anal. 15, 365–380 (2008)
Drusvyatskiy, D., Mordukhovich, B.S., Nghia, T.T.A.: Second-order growth, tilt stability, and metric regularity of the subdifferential. J. Convex Anal. 21, 1165–1192 (2014)
Liang, J., Fadili, J., Peyré, G.: Local linear convergence of forward-backward under partial smoothness. Adv. Neural Inf. Process Syst. (2014)
Liang, J., Fadili, J., Peyré, G.: Activity identification and local linear convergence of forward–backward type methods. SIAM J. Optim. 27, 408–437 (2017)
Aragón Artacho, F.J., Geoffroy, M.H.: Metric subregularity of the convex subdifferential in Banach spaces. J. Nonlinear Convex Anal. 15, 35–47 (2014)
Tseng, P., Yun, S.: A coordinate gradient descent method for nonsmooth separable minimization. Math. Program. 117, 387–423 (2009)
Rockafellar, R.T., Wets, R.J.-B.: Variational Analysis. Springer, Berlin (1998)
Dontchev, A.L., Rockafellar, R.T.: Implicit Functions and Solution Mappings. A View from Variational Analysis. Springer, Dordrecht (2009)
Burachik, R.S., Iusem, A.N.: Set-Valued Mappings and Enlargements of Monotone Operators. Springer, Berlin (2008)
Acknowledgements
This work was partially supported by the National Science Foundation (NSF) Grants DMS-1816386 and DMS-1816449, and by a Discovery Project from the Australian Research Council (ARC), DP190100555. The authors are deeply grateful to both anonymous referees for their careful reading and thoughtful suggestions, which allowed us to improve the original presentation significantly. Many insightful remarks from one referee allowed us to rewrite Section 3 in infinite-dimensional spaces.
Cite this article
Bello-Cruz, Y., Li, G. & Nghia, T.T.A. On the Linear Convergence of Forward–Backward Splitting Method: Part I—Convergence Analysis. J Optim Theory Appl 188, 378–401 (2021). https://doi.org/10.1007/s10957-020-01787-7
Keywords
- Nonsmooth and convex optimization problems
- Forward–Backward splitting method
- Linear convergence
- Quadratic growth condition