
Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems

Published in: Annals of Mathematics and Artificial Intelligence

Abstract

In this paper, an efficient implicit Lagrangian twin parametric insensitive support vector regression is proposed that leads to a pair of unconstrained minimization problems, motivated by the works on twin parametric insensitive support vector regression (Peng: Neurocomputing. 79, 26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer: Neural Comput. Applic. 22(1), 257–267, 2013). Since the objective function is strongly convex, piecewise quadratic and differentiable, it can be solved by gradient-based iterative methods. However, the objective function contains the non-smooth ‘plus’ function, so one can either use the generalized Hessian or a smooth approximation function in place of the ‘plus’ function, and then apply a simple Newton-Armijo step-size algorithm. These algorithms can be easily implemented in MATLAB and do not require any optimization toolbox. The advantage of the proposed method is that its algorithms take less training time and can deal with data having a heteroscedastic noise structure. To demonstrate the effectiveness of the proposed method, computational results are obtained on synthetic and real-world datasets; they clearly show comparable generalization performance and improved learning speed in comparison with support vector regression, twin support vector regression, and twin parametric insensitive support vector regression.
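As a concrete illustration of the solution machinery the abstract describes, the sketch below applies a generalized-Hessian Newton step with Armijo backtracking to a generic strongly convex, piecewise-quadratic objective built from the ‘plus’ function, alongside the smooth approximation p(x, α) = x + (1/α)·log(1 + e^(−αx)) in the style of Lee and Mangasarian's SSVM [20]. This is a hedged sketch in Python rather than MATLAB; the model objective f(u) = ½‖u‖² + (C/2)‖(Au − b)₊‖² and all function names here are illustrative assumptions, not the paper's actual pair of minimization problems.

```python
import numpy as np

def plus(x):
    # non-smooth 'plus' function: (x)_+ = max(x, 0)
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha=5.0):
    # smooth approximation of (x)_+ in the SSVM style:
    # p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x))
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def objective(u, A, b, C):
    # strongly convex, piecewise-quadratic model objective (illustrative)
    return 0.5 * u @ u + 0.5 * C * np.sum(plus(A @ u - b) ** 2)

def newton_armijo(A, b, C=10.0, tol=1e-6, max_iter=50):
    """Minimize f(u) = 0.5*||u||^2 + (C/2)*||(A u - b)_+||^2 using a
    generalized-Hessian Newton direction and Armijo backtracking."""
    m, n = A.shape
    u = np.zeros(n)
    for _ in range(max_iter):
        r = A @ u - b
        grad = u + C * A.T @ plus(r)          # gradient of f
        if np.linalg.norm(grad) < tol:
            break
        # generalized Hessian: I + C * A' diag(step(r)) A,
        # where step(r) is the (sub)derivative of the 'plus' function
        D = (r > 0).astype(float)
        H = np.eye(n) + C * A.T @ (D[:, None] * A)
        d = np.linalg.solve(H, -grad)         # Newton descent direction
        # Armijo step-size rule: halve t until sufficient decrease
        t, f0 = 1.0, objective(u, A, b, C)
        while objective(u + t * d, A, b, C) > f0 + 1e-4 * t * (grad @ d):
            t *= 0.5
        u = u + t * d
    return u
```

Because the objective is strongly convex, the Newton direction is always a descent direction, so the Armijo backtracking loop terminates; no external optimization toolbox is needed, only linear solves.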


References

  1. Azamathulla, M.H., Ghani, A.A., Chang, C.K., Hasan, Z.A., Zakaria, N.A.: Machine learning approach to predict sediment load–a case study. CLEAN–Soil Air Water. 38(10), 969–976 (2010)


  2. Balasundaram, S., Gupta, D., Prasad, S.C.: A new approach for training Lagrangian twin support vector machine via unconstrained convex minimization. Appl. Intell. 46(1), 124–134 (2017)

  3. Box, G.E.P., Jenkins, G.M.: Time Series Analysis: Forecasting and Control. Holden-Day, San Francisco (1976)


  4. Chen, X., Yang, J., Liang, J., Ye, Q.: Smooth twin support vector regression. Neural Comput. Applic. 21, 505–513 (2012)


  5. Cimen, M.: Estimation of daily suspended sediments using support vector machines. Hydrol. Sci. J. 53(3), 656–666 (2008)


  6. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20(3), 273–297 (1995)


  7. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines and Other Kernel Based Learning Methods. Cambridge University Press, Cambridge (2000)


  8. Demiriz, A., Bennett, K., Breneman, C., and Embrechts, M.: Support Vector Machine Regression in Chemometrics. Computing Science and Statistics (2001)

  9. Demsar, J.: Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 7, 1–30 (2006)


  10. Franc, V., Hlaváč, V.: An iterative algorithm learning the maximal margin classifier. Pattern Recogn. 36(9), 1985–1996 (2003)


  11. Fung, G., Mangasarian, O.L.: Finite Newton method for Lagrangian support vector machine. Neurocomputing. 55, 39–55 (2003)


  12. Garcia, S., Herrera, F.: An extension on statistical comparisons of classifiers over multiple data sets for all pairwise comparisons. J. Mach. Learn. Res. 9, 2677–2694 (2008)


  13. Garcia, G.N., Ebrahimi, T., Vesin, J.M.: Support vector EEG classification in the Fourier and time-frequency correlation domains. In: First International IEEE EMBS Conference on Neural Engineering, pp. 591–594 (2003)

  14. Gretton, A., Doucet, A., Herbrich, R., Rayner, P.J.W., Schölkopf, B.: Support vector regression for black-box system identification. In: Proceedings of the 11th IEEE Workshop on Statistical Signal Processing (2001)

  15. Gupta, D.: Training primal K-nearest neighbor based weighted twin support vector regression via unconstrained convex minimization. Appl. Intell., 1–30 (2017)

  16. Hao, P.: New support vector algorithms with parametric insensitive/margin model. Neural Netw. 23(1), 60–73 (2010)


  17. Hiriart-Urruty, J.B., Strodiot, J.J., Nguyen, V.H.: Generalized Hessian matrix and second-order optimality conditions for problems with C^{1,1} data. Appl. Math. Optim. 11(1), 43–56 (1984)


  18. Jayadeva, Khemchandani, R., Chandra, S.: Twin support vector machines for pattern classification. IEEE Trans. Pattern Anal. Mach. Intell. 29(5), 905–910 (2007)

  19. Khemchandani, R., Goyal, K., Chandra, S.: TWSVR: regression via twin support vector machine. Neural Netw. 74, 14–21 (2016)


  20. Lee, Y.J., Mangasarian, O.L.: SSVM: a smooth support vector machine for classification. Comput. Optim. Appl. 20(1), 5–22 (2001)


  21. Mangasarian, O.L.: Nonlinear Programming. SIAM, Philadelphia (1994)

  22. Mangasarian, O.L.: A finite Newton method for classification. Optimiz. Meth. Softw. 17, 913–929 (2002)


  23. Mangasarian, O.L., Musicant, D.R.: Lagrangian support vector machines. J. Mach. Learn. Res. 1, 161–177 (2001)

  24. Mangasarian, O.L., Wild, E.W.: Multisurface proximal support vector classification via generalized eigenvalues. IEEE Trans. Pattern Anal. Mach. Intell. 28(1), 69–74 (2006)


  25. Mukherjee, S., Osuna, E., Girosi, F.: Nonlinear prediction of chaotic time series using support vector machines. In: NNSP’97: Neural Networks for Signal Processing VII, Proceedings of the IEEE Signal Processing Society Workshop, Amelia Island, FL, USA, pp. 511–520 (1997)

  26. Muller, K.R., Smola, A.J., Ratsch, G., Schölkopf, B., Kohlmorgen, J.: Using support vector machines for time series prediction. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods- Support Vector Learning, pp. 243–254. MIT Press, Cambridge (1999)


  27. Murphy, P.M., Aha, D.W.: UCI Repository of Machine Learning Databases. University of California, Irvine. http://www.ics.uci.edu/~mlearn (1992)

  28. Osuna, E., Freund, R., Girosi, F.: Training support vector machines: an application to face detection. In: Proceedings of the 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 130–136. IEEE (1997)

  29. Peng, X.: TSVR: an efficient twin support vector machine for regression. Neural Netw. 23(3), 365–372 (2010a)


  30. Peng, X.: Primal twin support vector regression and its sparse approximation. Neurocomputing. 73, 2846–2858 (2010b)


  31. Peng, X.: Building sparse twin support vector machine classifiers in primal space. Inf. Sci. 181(18), 3967–3980 (2011)

  32. Peng, X.: Efficient twin parametric insensitive support vector regression model. Neurocomputing. 79, 26–38 (2012)


  33. Platt, J.C.: Sequential minimal optimization: a fast algorithm for training support vector machines. In: Scholkopf, B., Burges, C., Smola, A. (eds.) Advances in Kernel Methods - Support Vector Learning. MIT press, Cambridge (1998)


  34. Rasmussen, C.E., Neal, R.M., Hinton, G., Van Camp, D., Revow, M., Ghahramani, Z., Kustra, R., Tibshirani, R.: Delve data for evaluating learning in valid experiments. http://www.cs.toronto.edu/~delve (1995–1996)

  35. Richhariya, B., Gupta, D.: Facial expression recognition using iterative universum twin support vector machine. Appl. Soft Comput. 76, 53–67 (2019)


  36. Schölkopf, B., Smola, A.J., Williamson, R.C., Bartlett, P.L.: New support vector algorithms. Neural Comput. 12(5), 1207–1245 (2000)

  37. Shao, Y.H., Deng, N.Y., Yang, Z.M.: Least squares recursive projection twin support vector machine for classification. Pattern Recogn. 45(6), 2299–2307 (2012)

  38. Shao, Y., Chen, W., Zhang, J., Wang, Z., Deng, N.: An efficient weighted Lagrangian twin support vector machine for imbalanced data classification. Pattern Recogn. 47(9), 3158–3167 (2014)

  39. Shin, K.S., Lee, T.S., Kim, H.J.: An application of support vector machines in bankruptcy prediction model. Expert Syst. Appl. 28(1), 127–135 (2005)


  40. Sjoberg, J., Zhang, Q., Ljung, L., Berveniste, A., Delyon, B., Glorennec, P., Hjalmarsson, H., Juditsky, A.: Nonlinear black-box modeling in system identification: a unified overview. Automatica. 31, 1691–1724 (1995)


  41. Suykens, J.A.K., Vandewalle, J.: Least squares support vector machine classifiers. Neural. Process. Lett. 9(3), 293–300 (1999)


  42. Tao, Q., Wu, G.W., Wang, J.: A general soft method for learning SVM classifiers with L1-norm penalty. Pattern Recogn. 41(3), 939–948 (2008)


  43. Vapnik, V., Izmailov, R.: Knowledge transfer in SVM and neural networks. Ann. Math. Artif. Intell. 81, 3–19 (2017)


  44. Xu, Y., Wang, L.: K-nearest neighbor-based weighted twin support vector regression. Appl. Intell. 41(1), 92–101 (2014)



Author information

Correspondence to Deepak Gupta.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Gupta, D., Richhariya, B. Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems. Ann Math Artif Intell 89, 301–332 (2021). https://doi.org/10.1007/s10472-020-09708-0

