
Time Series Forecasting Using Differential Evolution-Based ANN Modelling Scheme

  • Research Article - Computer Engineering and Computer Science
  • Published in: Arabian Journal for Science and Engineering

Abstract

Over the past few decades, time series forecasting (TSF) has been predominantly performed using different artificial neural network (ANN) models. However, the performance of ANN models in TSF has not yet been fully explored due to several issues, such as the determination of a near-optimal ANN architecture for a time series and the efficiency of the training algorithm used to determine the near-optimal weights of the ANN. Motivated by this, we propose an adaptive differential evolution (DE)-based modelling scheme to automatically determine the near-optimal ANN architecture for the time series under study. Additionally, we propose an adaptive differential evolution-based ANN training algorithm (ADE-ANNT) to determine the near-optimal weights of the ANN. To make the adaptive modelling scheme consistently effective, several comparisons are made between different alternatives for treating the trend component and for normalization. Twenty-one benchmark time series datasets are used to compare the proposed method with established forecasting models, namely the autoregressive integrated moving average model, exponential smoothing with error, trend and seasonality, the deep belief network, and the multilayer perceptron trained with the Levenberg–Marquardt (LM) method. To assess the efficiency of the proposed ADE-ANNT training algorithm, comparisons are made with ANN training algorithms based on recently developed evolutionary algorithms, such as TLBO-ANNT, DE-CRO-HONNT and DE-ANNT+, as well as the widely used LM training algorithm. Extensive statistical analysis of the simulation results reveals the statistical superiority of the proposed training algorithm and the proposed method over their counterparts on the datasets used.
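
For readers unfamiliar with DE-based neural network training, the sketch below illustrates the general idea on which ADE-ANNT builds: the flattened weight vector of a small one-hidden-layer MLP is evolved with classic DE/rand/1/bin so as to minimise the one-step-ahead forecasting error. It is a minimal illustration, not the proposed ADE-ANNT or the adaptive modelling scheme: the proposed method adapts the DE control parameters and also determines the network architecture automatically, whereas here the architecture and the F and CR values are fixed, and all names, parameter values and the synthetic series are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def forecast(weights, X, n_in, n_hid):
    # Decode a flat weight vector into a one-hidden-layer MLP and predict one
    # step ahead for every lagged input window in X (shape: [n_samples, n_in]).
    i = 0
    W1 = weights[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = weights[i:i + n_hid]; i += n_hid
    W2 = weights[i:i + n_hid]; i += n_hid
    b2 = weights[i]
    return np.tanh(X @ W1 + b1) @ W2 + b2   # tanh hidden layer, linear output

def mse(weights, X, y, n_in, n_hid):
    return np.mean((forecast(weights, X, n_in, n_hid) - y) ** 2)

def de_train(X, y, n_in, n_hid, pop_size=30, F=0.5, CR=0.9, max_gen=200):
    # Classic DE/rand/1/bin over the MLP weight vector (fixed F and CR here;
    # the paper's ADE-ANNT adapts its control parameters instead).
    dim = n_in * n_hid + n_hid + n_hid + 1            # total number of weights
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
    fit = np.array([mse(ind, X, y, n_in, n_hid) for ind in pop])
    for _ in range(max_gen):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            # Binomial crossover with one dimension forced from the mutant.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial vector if it is no worse.
            f_trial = mse(trial, X, y, n_in, n_hid)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Usage with a placeholder series: build lagged windows and train.
series = np.sin(np.linspace(0, 8 * np.pi, 200))
n_in, n_hid = 4, 5
X = np.array([series[t:t + n_in] for t in range(len(series) - n_in)])
y = series[n_in:]
weights, err = de_train(X, y, n_in, n_hid)
print("training MSE:", err)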


Author information


Corresponding author

Correspondence to H. S. Behera.

About this article

Cite this article

Panigrahi, S., Behera, H.S. Time Series Forecasting Using Differential Evolution-Based ANN Modelling Scheme. Arab J Sci Eng 45, 11129–11146 (2020). https://doi.org/10.1007/s13369-020-05004-5
