
Improved bidirectional extreme learning machine based on enhanced random search

  • Regular Research Paper
  • Published in Memetic Computing

Abstract

The incremental extreme learning machine (I-ELM) was proposed in 2006 as a method for growing the network architecture of extreme learning machines (ELMs). To improve on the I-ELM, the bidirectional extreme learning machine (B-ELM) was developed in 2012. The B-ELM follows the same incremental framework as the I-ELM but separates learning into odd and even steps. At each odd learning step, a hidden node is generated randomly, as in the I-ELM. At each even learning step, a new hidden node is computed analytically by a formula based on the node added in the preceding step. However, some of the randomly generated hidden nodes may play only a minor role, so part of the network complexity added by the B-ELM may be unnecessary. To address this issue, this paper proposes an enhanced B-ELM method (referred to as EB-ELM). At each odd learning step, several candidate hidden nodes are generated randomly, but only the node that yields the largest reduction in residual error is added to the existing network. Simulation results show that the EB-ELM obtains higher accuracy and better performance than the B-ELM under the same network architecture. In addition, the EB-ELM converges faster than the B-ELM, which means it achieves a more compact network and a higher learning speed.
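The odd-step selection described above can be illustrated with a short sketch: several candidate hidden nodes are drawn at random, the output weight of each candidate is computed by least squares against the current residual, and only the candidate that shrinks the residual the most is kept. The Python sketch below is a minimal illustration, assuming a sigmoid activation and uniformly distributed random weights; the function and variable names are ours, not taken from the paper.

```python
import numpy as np

def add_enhanced_random_node(X, e, n_candidates=10, rng=None):
    """Illustrative odd-step node selection (not the authors' exact code).

    X : (N, d) training inputs, e : (N,) current residual error.
    Generates n_candidates random hidden nodes and returns the one
    (weights, bias, output weight, new residual) that reduces e the most.
    """
    rng = np.random.default_rng(rng)
    best = None
    for _ in range(n_candidates):
        w = rng.uniform(-1.0, 1.0, size=X.shape[1])   # random input weights
        b = rng.uniform(-1.0, 1.0)                     # random bias
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))         # hidden-node output (sigmoid, assumed)
        beta = float(h @ e) / float(h @ h)             # least-squares output weight
        new_e = e - beta * h                           # residual after adding this node
        if best is None or np.linalg.norm(new_e) < np.linalg.norm(best[-1]):
            best = (w, b, beta, new_e)
    return best
```

In a full training loop, the selected node would be appended to the network and the returned residual would become the target for the next (even) learning step, which in the B-ELM is obtained analytically rather than by random search.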



Acknowledgements

The authors would like to thank the editor and reviewers for their invaluable suggestions to improve the quality of this paper. This research was supported by the National Natural Science Foundation of China under Grant No. 61672358.

Author information


Corresponding author

Correspondence to Zhong Ming.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (xlsx 65 KB)


About this article


Cite this article

Cao, W., Ming, Z., Wang, X. et al. Improved bidirectional extreme learning machine based on enhanced random search. Memetic Comp. 11, 19–26 (2019). https://doi.org/10.1007/s12293-017-0238-1

