
Proximal operator and optimality conditions for ramp loss SVM

  • Original Paper
  • Published in: Optimization Letters

Abstract

Support vector machines with ramp loss (\(L_r\)-SVM) have attracted considerable attention due to the robustness of the ramp loss. However, the corresponding optimization problem is non-convex, and the existing Karush–Kuhn–Tucker (KKT) conditions are only first-order necessary conditions. To enrich the optimality theory of \(L_r\)-SVM, we first introduce and analyze the proximal operator of the ramp loss, and then establish a stronger optimality condition, P-stationarity, which is proved to be a first-order necessary and sufficient condition for a local minimizer of \(L_r\)-SVM. Finally, we define P-support vectors based on the P-stationary point and show that, under mild conditions, all P-support vectors of \(L_r\)-SVM lie on the two support hyperplanes.
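To make the role of the proximal operator concrete, the following is a minimal numerical sketch in Python. It assumes the scalar ramp loss r(t) = min(1, max(0, 1 − t)); the paper's exact parameterization may differ, and the names ramp_loss and prox_ramp are illustrative only. Rather than hard-coding a closed-form threshold rule, the sketch minimizes the prox subproblem over each of the three affine pieces of the loss and keeps the best candidate.

def ramp_loss(t):
    # Assumed scalar ramp loss r(t) = min(1, max(0, 1 - t)): the hinge loss truncated at 1.
    # The parameterization in the paper may differ from this illustrative form.
    return min(1.0, max(0.0, 1.0 - t))

def prox_ramp(z, gamma):
    # Proximal operator prox_{gamma*r}(z) = argmin_x 0.5*(x - z)**2 + gamma*r(x).
    # The ramp loss has three affine pieces, so it suffices to minimize the
    # subproblem over each piece and keep the candidate with the smallest value.
    candidates = [
        max(z, 1.0),                    # piece x >= 1, where r(x) = 0
        min(max(z + gamma, 0.0), 1.0),  # piece 0 <= x <= 1, where r(x) = 1 - x
        min(z, 0.0),                    # piece x <= 0, where r(x) = 1
    ]
    objective = lambda x: 0.5 * (x - z) ** 2 + gamma * ramp_loss(x)
    return min(candidates, key=objective)

# Points deep in either flat region of the loss (well classified, or a clear
# outlier) are left unchanged, while points in the linear region are shifted.
for z in (-2.0, 0.3, 0.9, 2.0):
    print(f"prox at z = {z:+.1f}: {prox_ramp(z, gamma=0.5):+.2f}")

On the sample points, values deep in either flat region of the loss stay fixed while values in the linear region move by gamma; this thresholding behaviour is the kind of structure a prox-based stationarity condition can exploit.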



Acknowledgements

The authors would like to thank the associate editor and two anonymous referees for their constructive comments, which have significantly improved the quality of the paper. This work was supported by the National Natural Science Foundation of China (11971052, 11926348-9, 61866010, 11871183), and the Natural Science Foundation of Hainan Province (120RC449).

Author information

Corresponding author

Correspondence to Huajun Wang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, H., Shao, Y. & Xiu, N. Proximal operator and optimality conditions for ramp loss SVM. Optim Lett 16, 999–1014 (2022). https://doi.org/10.1007/s11590-021-01756-7

