
Learning local instance correlations for multi-target regression


Abstract

Multi-target regression (MTR) refers to learning multiple related regression tasks simultaneously. Although much progress has been made in multi-target regression, two issues remain challenging: how to model the underlying relationships between input features and output targets, and how to exploit inter-target dependencies. In this study, an effective algorithm named LLIC is proposed; it learns local instance correlations to reveal both the relationships between features and output targets and the dependencies among targets. First, a well-established instance selection method is adapted to work directly with multi-target data, constructing a collection of local instances for each instance. Then, to exploit the relationships between input features and output targets and to reveal inter-target dependencies, the collection of local instances is divided into two spaces: a feature space and a target space. Implicit features are extracted from both the input features and the targets in a statistical manner. Finally, a prediction model for each output target is trained on an expanded input space in which the implicit features serve as additional input variables. Extensive experiments on 18 benchmark datasets demonstrate that the proposed LLIC method achieves competitive performance against representative state-of-the-art multi-target regression methods.
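The abstract describes the pipeline only at a high level; the sketch below illustrates one plausible reading of it. It is a minimal sketch, not the authors' implementation: it assumes k-nearest neighbours as the instance selection step, neighbourhood means as the statistically derived implicit features, and ridge regression as the per-target base learner, none of which are specified in the abstract.

```python
# Sketch of an LLIC-style pipeline (assumptions: k-NN instance selection,
# neighbourhood means as "implicit features", ridge regression as base learner).
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.linear_model import Ridge

def implicit_features(X, Y, k=10):
    """For each instance, summarise its k local instances in two spaces."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)             # idx[:, 0] is the instance itself
    neigh = idx[:, 1:]                    # keep only the k local instances
    feat_space = X[neigh].mean(axis=1)    # statistic over the local feature space
    target_space = Y[neigh].mean(axis=1)  # statistic over the local target space
    return np.hstack([feat_space, target_space])

def fit_llic_like(X, Y, k=10):
    """Train one regressor per target on the expanded input space."""
    Z = np.hstack([X, implicit_features(X, Y, k)])    # original + implicit features
    return [Ridge().fit(Z, Y[:, t]) for t in range(Y.shape[1])]

# Usage: models = fit_llic_like(X_train, Y_train)
# At test time the target-space statistics must be computed from the training
# neighbours of each test instance, since test targets are unknown.
```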


Notes

  1. http://mulan.sourceforge.net/datasets-mtr.html


Acknowledgements

This work is supported by the National Natural Science Foundation of China (Grant No. 61806033) and the Natural Science Foundation of Chongqing (Grant No. cstc2019jcyj-msxmX0021).

Author information


Corresponding author

Correspondence to Kaiwei Sun.


Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Sun, K., Deng, M., Li, H. et al. Learning local instance correlations for multi-target regression. Appl Intell 51, 6124–6135 (2021). https://doi.org/10.1007/s10489-020-02112-5

