Shallow and deep neural network training by water wave optimization
Swarm and Evolutionary Computation (IF 8.2), Pub Date: 2019-08-07, DOI: 10.1016/j.swevo.2019.100561
Xiao-Han Zhou, Min-Xia Zhang, Zhi-Ge Xu, Ci-Yun Cai, Yu-Jiao Huang, Yu-Jun Zheng

It is well known that the performance of artificial neural networks (ANNs) is significantly affected by their structure design and parameter selection, for which traditional training methods have drawbacks such as long training times, over-fitting, and premature convergence. Evolutionary algorithms (EAs) have provided an effective tool for ANN parameter optimization. However, simultaneously optimizing ANN structures and parameters remains a difficult problem. In this study, we adapt water wave optimization (WWO), a relatively new EA, to optimize both the parameters and structures of ANNs, including classical shallow ANNs and deep neural networks (DNNs). We use a variable-dimensional solution encoding to represent both the structure and parameters of an ANN, and adapt the WWO propagation, refraction, and breaking operators to efficiently evolve these variable-dimensional solutions for the resulting complex network optimization problems. Computational experiments on a variety of benchmark datasets show that the WWO algorithm achieves very competitive performance compared to popular gradient-based algorithms and other EAs.
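To illustrate the idea, the following is a minimal sketch of the standard WWO operators (propagation, refraction, and breaking) applied to training the weights of a small fixed-structure network on XOR. It is not the paper's method: the paper additionally evolves the network structure via a variable-dimensional encoding, whereas this sketch keeps the dimension fixed, and all parameter values (population size, wavelengths, wave heights) are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 network on XOR; the fitness of a flattened weight
# vector w (9 parameters) is the mean squared error on the 4 points.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0., 1., 1., 0.])

def mse(w):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

dim, lo, hi = 9, -5.0, 5.0
pop_size, h_max, alpha, beta, iters = 10, 6, 1.01, 0.01, 300
waves = rng.uniform(lo, hi, (pop_size, dim))
fit = np.array([mse(w) for w in waves])
lam = np.full(pop_size, 0.5)         # per-wave wavelengths
height = np.full(pop_size, h_max)    # per-wave heights
best, best_f = waves[fit.argmin()].copy(), fit.min()
init_best_f = best_f

for _ in range(iters):
    for i in range(pop_size):
        # Propagation: random step proportional to the wavelength.
        cand = np.clip(waves[i] + rng.uniform(-1, 1, dim) * lam[i] * (hi - lo),
                       lo, hi)
        f = mse(cand)
        if f < fit[i]:
            waves[i], fit[i], height[i] = cand, f, h_max
            if f < best_f:
                best, best_f = cand.copy(), f
                # Breaking: solitary waves do local search around a new best.
                for _ in range(2):
                    br = np.clip(cand + rng.normal(0, 1, dim) * beta * (hi - lo),
                                 lo, hi)
                    fb = mse(br)
                    if fb < best_f:
                        waves[i], fit[i] = br, fb
                        best, best_f = br.copy(), fb
        else:
            height[i] -= 1
            if height[i] == 0:
                # Refraction: regenerate a stagnant wave toward the best one.
                waves[i] = np.clip(
                    rng.normal((best + waves[i]) / 2,
                               np.abs(best - waves[i]) / 2 + 1e-12),
                    lo, hi)
                fit[i] = mse(waves[i])
                height[i], lam[i] = h_max, 0.5
    # Wavelength update: fitter waves get shorter wavelengths (finer search).
    fmin, fmax = fit.min(), fit.max()
    lam *= alpha ** (-(fit - fmin + 1e-12) / (fmax - fmin + 1e-12))

print(f"initial best MSE: {init_best_f:.4f}, final best MSE: {best_f:.4f}")
```

Because a wave is only replaced when its candidate improves on it, the best fitness is monotonically non-increasing; for structure optimization the paper's encoding would additionally let the dimension of each wave vary between operators.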
