Research on a dynamic full Bayesian classifier for time-series data with insufficient information

Abstract

Small-sample time-series data with insufficient information are ubiquitous, and improving the reliability of their classification is challenging. At present, there is still no tailored method for the dynamic classification of small-sample time-series data. To address this, we first set up an architecture of dynamic Bayesian derivative classifiers and then establish a dynamic full Bayesian classifier for small-sample time-series data. The joint density of the attributes is estimated with a multivariate Gaussian kernel function with smoothing parameters. The classifier is optimized by splitting the smoothing parameters into intervals, optimizing them by constructing a smoothing-parameter configuration tree (or forest), and then selecting and averaging the resulting classifiers. The dynamic full Bayesian classifier is applied to forecasting turning points. Experimental results show that the classifier developed in this paper is more accurate than nine other commonly used classifiers.
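As a rough illustration of the kernel-density step only, the sketch below is a minimal Python example under stated assumptions, not the paper's method: it estimates each class-conditional joint attribute density with a product (diagonal) multivariate Gaussian kernel and scores candidate smoothing-parameter vectors by leave-one-out accuracy. The function names, the diagonal-bandwidth form, and the leave-one-out scoring are assumptions of this sketch; the paper's dynamic (temporal) structure, smoothing-parameter configuration tree (or forest), and classifier averaging are not reproduced here.

```python
import numpy as np

def gaussian_kernel_density(x, samples, h):
    """Product (diagonal) multivariate Gaussian kernel density estimate at x.

    samples: (n, d) attribute vectors of one class; h: (d,) smoothing parameters.
    """
    n, d = samples.shape
    z = (x - samples) / h                              # standardized differences
    kernels = np.exp(-0.5 * np.sum(z * z, axis=1))     # one Gaussian kernel per sample
    return kernels.sum() / (n * np.prod(h) * (2 * np.pi) ** (d / 2))

def predict(x, class_data, priors, h):
    """Pick the class maximizing prior times estimated joint attribute density."""
    scores = {c: priors[c] * gaussian_kernel_density(x, Xc, h)
              for c, Xc in class_data.items()}
    return max(scores, key=scores.get)

def select_bandwidth(class_data, priors, candidates):
    """Score each candidate smoothing-parameter vector by leave-one-out accuracy,
    a simplified stand-in for the paper's configuration-tree search and averaging."""
    def loo_accuracy(h):
        hits, total = 0, 0
        for c, Xc in class_data.items():
            for i in range(len(Xc)):
                held_out = {k: (np.delete(v, i, axis=0) if k == c else v)
                            for k, v in class_data.items()}
                hits += int(predict(Xc[i], held_out, priors, h) == c)
                total += 1
        return hits / total
    return max(candidates, key=loo_accuracy)
```

In this sketch, `candidates` could, for instance, be smoothing-parameter vectors built from the midpoints of the intervals into which each parameter is split, with the selected vector then used in `predict` for new observations.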

Acknowledgements

This work is supported by the National Social Science Foundation of China [Grant number 18BTJ020]; the National Natural Science Foundation of China [Grant numbers 71771179, 72021002, 82072228]; the Foundation of National Key R&D Program of China [Grant number 2020YFC2008700]; and the Foundation of Shanghai 5G + Intelligent Medical Innovation Laboratory.

Author information

Corresponding author

Correspondence to Yongrui Duan.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Wang, S., Zhang, S., Wu, T. et al. Research on a dynamic full Bayesian classifier for time-series data with insufficient information. Appl Intell 52, 1059–1075 (2022). https://doi.org/10.1007/s10489-021-02448-6
