
Nonlinear dimension reduction for conditional quantiles

  • Regular Article
  • Published in Advances in Data Analysis and Classification

Abstract

In practice, data often display heteroscedasticity, making quantile regression (QR) a more appropriate methodology. Modeling the data while maintaining a flexible nonparametric fit requires smoothing over a high-dimensional space, which may not be feasible when the number of predictor variables is large. This necessitates dimension reduction techniques for conditional quantiles, which extract linear combinations of the predictor variables without losing information about the conditional quantile. However, nonlinear features can achieve greater dimension reduction. We therefore present the first nonlinear extension of the linear algorithm for estimating the central quantile subspace (CQS) using kernel data. First, we describe the feature CQS within the framework of reproducing kernel Hilbert spaces, and second, we illustrate its performance through simulation examples and real data applications. Specifically, we emphasize visualizing various aspects of the data structure using the first two feature extractors, and we highlight the ability to combine the proposed algorithm with linear classification and regression algorithms. The results show that the feature CQS is an effective kernel tool for performing nonlinear dimension reduction for conditional quantiles.
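The abstract describes a two-step idea: map the predictors into a kernel (RKHS) feature space, then apply a linear quantile-based dimension reduction step to the resulting features. The sketch below is only a rough illustration of that structure, not the authors' feature-CQS algorithm: it approximates the RKHS feature map with a Gaussian-kernel PCA (scikit-learn's KernelPCA) and, as a crude stand-in for a single feature extractor, uses the slope vector of a linear quantile regression (QuantileRegressor) fitted on those features. The data, the number of components, gamma, and the quantile level tau are all arbitrary choices for this toy example.

```python
# Hedged sketch: kernel features first, linear quantile step second.
# This is an illustration only, not the feature-CQS estimator of the paper.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)

# Toy heteroscedastic data: Y depends on X through a nonlinear single index.
n, p = 300, 5
X = rng.normal(size=(n, p))
index = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2
Y = index + (0.5 + 0.5 * np.abs(index)) * rng.normal(size=n)

# Step 1 (assumption): approximate the RKHS feature map with Gaussian-kernel PCA.
phi = KernelPCA(n_components=20, kernel="rbf", gamma=0.1).fit_transform(X)

# Step 2 (assumption): use the slope of a linear tau-th quantile regression on the
# kernel features as a crude stand-in for one estimated feature extractor.
tau = 0.5
qr = QuantileRegressor(quantile=tau, alpha=1e-4, solver="highs").fit(phi, Y)
beta = qr.coef_ / np.linalg.norm(qr.coef_)

# Project the kernel features onto the estimated direction and compare with the truth.
z1 = phi @ beta
print("corr(z1, true index):", round(float(np.corrcoef(z1, index)[0, 1]), 2))
```

The paper estimates the directions spanning the feature CQS differently; this snippet is only meant to make the "kernel feature extraction followed by a linear quantile step" idea concrete, and the resulting z1 plays the role of the first feature extractor used for visualization.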


References

  • Aizerman M, Braverman E, Rozonoer L (1964) Theoretical foundations of the potential function method in pattern recognition learning. Autom Remote Control 25:821–837
  • Akaho S (2001) Kernel method for canonical correlation analysis. In: Proceedings of the international meeting of the Psychometric Society (IMPS2001)
  • Aronszajn N (1950) Theory of reproducing kernels. Trans Am Math Soc 68(3):337–404
  • Bach FR, Jordan MI (2002) Kernel independent component analysis. J Mach Learn Res 3:1–48
  • Baudat G, Anouar F (2000) Generalized discriminant analysis using a kernel approach. Neural Comput 12(10):2385–2404
  • Chaudhuri P (1991) Nonparametric estimates of regression quantiles and their local Bahadur representation. Ann Stat 19(2):760–777
  • Christou E (2020) Central quantile subspace. Stat Comput 30:677–695
  • Christou E, Akritas MG (2016) Single index quantile regression for heteroscedastic data. J Multivar Anal 150:169–182
  • Duan K, Keerthi SS, Poo AN (2003) Evaluation of simple performance measures for tuning SVM hyperparameters. Neurocomputing 51:41–59
  • Fukumizu K, Bach FR, Gretton A (2007) Statistical consistency of kernel canonical correlation analysis. J Mach Learn Res 8:361–383
  • Guerre E, Sabbah C (2012) Uniform bias study and Bahadur representation for local polynomial estimators of the conditional quantile function. Econom Theory 28(1):87–129
  • Harrison D, Rubinfeld DL (1978) Hedonic prices and the demand for clean air. J Environ Econ Manag 5:81–102
  • Hashem H, Vinciotti V, Alhamzawi R, Yu K (2016) Quantile regression with group lasso for classification. Adv Data Anal Classif 10:375–390
  • Keerthi SS, Lin CJ (2003) Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Comput 15:1667–1689
  • Koenker R, Bassett G (1978) Regression quantiles. Econometrica 46(1):33–50
  • Koenker R, Machado J (1999) Goodness of fit and related inference processes for quantile regression. J Am Stat Assoc 94(448):1296–1310
  • Kong E, Xia Y (2012) A single-index quantile regression model and its estimation. Econom Theory 28(4):730–768
  • Kong E, Xia Y (2014) An adaptive composite quantile approach to dimension reduction. Ann Stat 42(4):1657–1688
  • Kong E, Linton O, Xia Y (2010) Uniform Bahadur representation for local polynomial estimates of M-regression and its application to the additive model. Econom Theory 26(5):1529–1564
  • Kordas G (2006) Smoothed binary regression quantiles. J Appl Econ 21(3):387–407
  • Lai PL, Fyfe C (2000) Kernel and nonlinear canonical correlation analysis. Int J Neural Syst 10(5):365–377
  • Li KC (1991) Sliced inverse regression for dimension reduction. J Am Stat Assoc 86(414):316–327
  • Li B, Artemiou A, Li L (2011) Principal support vector machines for linear and nonlinear sufficient dimension reduction. Ann Stat 39(6):3182–3210
  • Luo W, Li B, Yin X (2014) On efficient dimension reduction with respect to a statistical functional of interest. Ann Stat 42(1):382–412
  • Mika S, Rätsch G, Weston J, Schölkopf B, Müller KR (1999) Fisher discriminant analysis with kernels. In: Proceedings of the IEEE Neural Networks for Signal Processing Workshop IX, pp 41–48
  • Opsomer JD, Ruppert D (1998) A fully automated bandwidth selection method for fitting additive models. J Am Stat Assoc 93(442):605–619
  • Roth V, Steinhage V (2000) Nonlinear discriminant analysis using kernel functions. In: Advances in neural information processing systems, pp 568–574. MIT Press, Cambridge
  • Schölkopf B, Smola AJ, Müller KR (1998) Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput 10(5):1299–1319
  • Schölkopf B, Smola AJ, Müller KR (1999) Kernel principal component analysis. In: Advances in kernel methods: support vector learning, pp 327–352. MIT Press, Cambridge
  • Schölkopf B, Tsuda K, Vert JP (eds) (2004) Kernel methods in computational biology. MIT Press, Cambridge
  • Sigillito VG, Wing SP, Hutton LV, Baker KB (1989) Classification of radar returns from the ionosphere using neural networks. Johns Hopkins APL Tech Dig 10:262–266
  • Takeuchi I, Le QV, Sears T, Smola AJ (2006) Nonparametric quantile regression. J Mach Learn Res 7:1231–1264
  • Truong YK (1989) Asymptotic properties of kernel estimators based on local medians. Ann Stat 17(2):606–617
  • Wang C, Shin SJ, Wu Y (2018) Principal quantile regression for sufficient dimension reduction with heteroscedasticity. Electron J Stat 12:2114–2140
  • Wu HM (2008) Kernel sliced inverse regression with applications to classification. J Comput Graph Stat 17(3):590–610
  • Wu TZ, Yu K, Yu Y (2010) Single-index quantile regression. J Multivar Anal 101(7):1607–1621
  • Wu Q, Liang F, Mukherjee S (2013) Kernel sliced inverse regression: regularization and consistency. Abstr Appl Anal 2013, Article ID 540725, 11 pages
  • Yeh YR, Huang SY, Lee YJ (2009) Nonlinear dimension reduction with kernel sliced inverse regression. IEEE Trans Knowl Data Eng 21(11):1590–1603
  • Yu K, Jones MC (1998) Local linear quantile regression. J Am Stat Assoc 93(441):228–237
  • Zhu LP, Zhu LX, Feng ZH (2010) Dimension reduction in regression through cumulative slicing estimation. J Am Stat Assoc 105(492):1455–1466


Acknowledgements

We wish to thank the authors of Hashem et al. (2016) for providing us with the R code for performing group lasso for binary classification. We also want to thank the anonymous referees, whose comments led to improvements in the presentation of this paper.

Author information


Corresponding author

Correspondence to Eliana Christou.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Christou, E., Settle, A. & Artemiou, A. Nonlinear dimension reduction for conditional quantiles. Adv Data Anal Classif 15, 937–956 (2021). https://doi.org/10.1007/s11634-021-00439-6
