PSO-GA based hybrid with Adam Optimization for ANN training with application in Medical Diagnosis
Cognitive Systems Research (IF 3.9), Pub Date: 2020-12-01, DOI: 10.1016/j.cogsys.2020.08.011
Rajesh K. Yadav, Anubhav

Abstract This paper introduces a novel PSO-GA based hybrid training algorithm with Adam optimization and contrasts its performance with the generic gradient-descent-based backpropagation algorithm with Adam optimization for training Artificial Neural Networks (ANNs). We aim to overcome the shortcomings of the traditional algorithm, such as a slower convergence rate and frequent convergence to local minima, by employing the characteristics of evolutionary algorithms. PSO converges quickly, a property that can be exploited to compensate for the slow convergence of traditional backpropagation (which is caused by small gradient values). The integration with GA, in turn, addresses the drawback of convergence to local minima, since GA is capable of efficient global search. By integrating these algorithms, we propose a new hybrid algorithm for training ANNs. We compare both algorithms on the application of medical diagnosis. Results show that the proposed hybrid training algorithm significantly outperforms the traditional training algorithm, improving ANN accuracy with a 20% increase in average testing accuracy and a 0.7% increase in best testing accuracy.
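The abstract describes training an ANN by combining PSO's fast convergence with GA's global search. The sketch below illustrates that general idea on a toy problem: a population of flat weight vectors is updated with standard PSO velocity rules, and each iteration the worst particles are replaced by GA-style crossover-and-mutation children of good ones. This is only an illustrative sketch, not the authors' exact scheme; the abstract does not specify how Adam is woven into the hybrid, so the Adam component is omitted here, and all hyperparameters (population size, inertia, mutation scale) are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Tiny 2-2-1 network encoded as one flat parameter vector.
SHAPES = [(2, 2), (2,), (2, 1), (1,)]
DIM = sum(int(np.prod(s)) for s in SHAPES)

def unpack(theta):
    """Slice the flat vector back into weight matrices and biases."""
    parts, i = [], 0
    for s in SHAPES:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def loss(theta):
    """MSE of the tiny network on the XOR data; this is the PSO fitness."""
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    return float(np.mean((out - y) ** 2))

# --- PSO-GA hybrid loop (illustrative hyperparameters) ---
POP, ITERS = 30, 300
w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
pos = rng.normal(0.0, 1.0, (POP, DIM))
vel = np.zeros((POP, DIM))
pbest = pos.copy()
pbest_f = np.array([loss(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for t in range(ITERS):
    # PSO step: pull each particle toward its personal and the global best.
    r1, r2 = rng.random((POP, DIM)), rng.random((POP, DIM))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

    # GA step: replace roughly the worst quarter of the swarm with
    # crossover + mutation children bred from the better half.
    order = pbest_f.argsort()
    for k in order[-(POP // 4):]:
        a, b = pbest[rng.choice(order[:POP // 2], 2, replace=False)]
        mask = rng.random(DIM) < 0.5                         # uniform crossover
        child = np.where(mask, a, b) + rng.normal(0, 0.1, DIM)  # Gaussian mutation
        pos[k], vel[k] = child, 0.0

print(f"final MSE of best particle: {loss(gbest):.4f}")
```

Because the global best only ever improves, the final MSE is guaranteed to be at least as good as the best random initialization; on this toy task the hybrid reliably drops well below the 0.25 MSE of a constant-0.5 predictor.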

Updated: 2020-12-01