Abstract
As a research hotspot in machine learning, ensemble learning improves the prediction accuracy of the final model by constructing and combining multiple base models. In recent years, many researchers have worked on combining deep networks with ensemble learning to improve the accuracy of neural network models across a range of scenarios and tasks. However, not every neural network is suitable for inclusion in an ensemble: deep-network ensembles require each member network to have high accuracy and a large discrepancy (diversity) with respect to the other networks. The first stage of building such an ensemble is to generate a set of candidate deep networks. Building on the existing multiobjective deep belief networks ensemble (MODBNE) method, this work uses a Gaussian random field model as a pre-screening strategy during generation of the candidate network set. Only individuals with high potential for improvement are selected for fitness-function evaluation, so that a large number of network models with higher accuracy and larger inter-network discrepancy can be obtained. This effectively improves the quality of the solutions and reduces the time spent training the neural networks.
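The pre-screening idea described above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes each candidate network is encoded as a real-valued hyperparameter vector, fits a Gaussian random field (Kriging) surrogate to the candidates evaluated so far, and uses expected improvement to pick which few candidates are worth the expensive fitness evaluation (i.e., actually training the network). The function names `prescreen` and `expected_improvement` are illustrative.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential covariance between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(X_train, y_train, X_cand, length_scale=1.0, noise=1e-6):
    """Gaussian random field (Kriging) posterior mean/std at candidate points."""
    K = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_cand, length_scale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha                       # posterior mean
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - (v ** 2).sum(0), 1e-12, None)  # prior variance is 1
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best_y):
    """EI for minimization: how much a candidate may improve on the best value."""
    z = (best_y - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))  # normal CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)          # normal PDF
    return (best_y - mu) * Phi + sigma * phi

def prescreen(X_train, y_train, X_cand, k):
    """Indices of the k candidates most worth the expensive fitness evaluation."""
    mu, sigma = gp_posterior(X_train, y_train, X_cand)
    ei = expected_improvement(mu, sigma, y_train.min())
    return np.argsort(-ei)[:k]
```

In an evolutionary loop, `X_train`/`y_train` would hold the hyperparameter vectors and validation errors of networks already trained, and `prescreen` would filter each generation's offspring so that only the most promising few are trained, which is where the time savings claimed in the abstract come from.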
References
Dietterich, T. G. (2000). Ensemble methods in machine learning. In International Workshop on Multiple Classifier Systems. Springer, Berlin.
LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444.
Jiang, D., et al. (2020). An energy-efficient networking approach in cloud services for IIoT networks. IEEE Journal on Selected Areas in Communications, 38(5), 928–941.
Jiang, D., Wang, Y., Lv, Z., Qi, S., & Singh, S. (2020). Big data analysis based network behavior insight of cellular networks for industry 4.0 applications. IEEE Transactions on Industrial Informatics, 16(2), 1310–1320.
He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In IEEE Conference on Computer Vision and Pattern Recognition.
Ronneberger, O., Fischer, P., & Brox, T. (2015). U-Net: Convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-Assisted Intervention (MICCAI). Springer.
Girshick, R. (2015). Fast R-CNN. In IEEE International Conference on Computer Vision.
Kim, J., Lee, J. K., & Lee, K. M. (2016). Accurate image super-resolution using very deep convolutional networks. In IEEE Conference on Computer Vision and Pattern Recognition.
Peimankar, A., & Puthusserypady, S. (2019). An ensemble of deep recurrent neural networks for P-wave detection in electrocardiogram. In 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE.
Yu, J. (2019). A selective deep stacked denoising autoencoders ensemble with negative correlation learning for gearbox fault diagnosis. Computers in Industry, 108, 62–72.
Li, J., Wu, S., Liu, C., Yu, Z., & Wong, H. S. (2020). Semi-supervised deep coupled ensemble learning with classification landmark exploration. IEEE Transactions on Image Processing, 29, 538–550.
Ning, J., et al. (2019). A computer-aided detection system for the detection of lung nodules based on 3D-ResNet. Applied Sciences, 9(24), 5544.
Jiang, W., Chen, Z., Xiang, Y., Shao, D., & Zhang, J. (2019). SSEM: A novel self-adaptive stacking ensemble model for classification. IEEE Access, 7, 120337–120349.
Zhang, C., Lim, P., Qin, A. K., & Tan, K. C. (2017). Multiobjective deep belief networks ensemble for remaining useful life estimation in prognostics. IEEE Transactions on Neural Networks and Learning Systems, 28(10), 2306–2318.
Ali, I. M., Essam, D., & Kasmarik, K. (2020). A novel design of differential evolution for solving discrete traveling salesman problems. Swarm and Evolutionary Computation, 52, 100607.
Pan, L., He, C., Tian, Y., Wang, H., Zhang, X., & Jin, Y. (2019). A classification-based surrogate-assisted evolutionary algorithm for expensive many-objective optimization. IEEE Transactions on Evolutionary Computation, 23(1), 74–88.
Zhang, Q., & Li, H. (2007). MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, 11(6), 712–731.
Liu, Y., Yao, X., & Higuchi, T. (2000). Evolutionary ensembles with negative correlation learning. IEEE Transactions on Evolutionary Computation, 4(4), 380–387.
Emmerich, M. T. M., Giannakoglou, K. C., & Naujoks, B. (2006). Single- and multiobjective evolutionary optimization assisted by gaussian random field metamodels. IEEE Transactions on Evolutionary Computation, 10(4), 421–439.
Xiao, H., Rasul, K., & Vollgraf, R. (2017). Fashion-MNIST: A novel image dataset for benchmarking machine learning algorithms. arXiv preprint arXiv:1708.07747.
Acknowledgements
This work was supported by the Key Project of National Natural Science Foundation of China (U1908212) and the Fundamental Research Funds for the Central Universities (N2017013, N2017014).
Cite this article
Zhang, C., Dai, Z., Liang, X. et al. An evolutionary generation method of deep neural network sets combined with Gaussian random field. Wireless Netw (2021). https://doi.org/10.1007/s11276-021-02677-0