The Evolution of Neural Networks [Editor's Remarks]
IEEE Computational Intelligence Magazine (IF 9) Pub Date: 2021-07-20, DOI: 10.1109/mci.2021.3084387
Chuan-Kang Ting

The focus on evolutionary neural architecture search in this issue has led me to ponder the evolution of biological neural networks, or rather, of the human brain. Researchers in neuroanatomy have attributed humans' cognitive and mental development to the growth of our brain throughout evolution. Yet bigger is not always better: it was also discovered that, once the brain reaches a certain size, further growth only renders it less efficient. In addition, the brain is constrained by its inherent architecture and signal-processing time. These discoveries in neuroscience are intriguingly analogous to advances in neural networks and evolutionary computation. Stacking perceptrons empowers artificial neural networks to solve complex problems but detracts from their efficiency. Indeed, human brain evolution and artificial neural network evolution both face a trade-off between capacity and efficiency. The workings of nature truly give us a lot to mull over.

This issue includes five Features articles. The first article proposes an evolutionary multi-objective model compression approach to simultaneously optimize the model size and accuracy of a deep neural network. The second and third articles adopt the notion of self-supervised learning in neural architecture evolution, achieving state-of-the-art performance. To improve the accuracy of wind speed forecasting, the fourth article uses an evolutionary algorithm to optimize the architecture of dendritic neural regression. The fifth article presents a self-adaptive mutation strategy for block-based evolutionary neural architecture search on convolutional neural networks. In the Columns, the article proposes a multi-view feature construction method based on genetic programming and ensemble techniques.
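The capacity-versus-efficiency trade-off that runs through these articles can be made concrete with a toy sketch of evolutionary multi-objective search. The code below is not the method of any article in this issue; it is a minimal, generic illustration in which genomes are MLP layer widths, `model_size` is an approximate parameter count, and `proxy_error` is a hypothetical stand-in for validation error (no training occurs). A small population evolves by mutation while the Pareto front over (size, error) is preserved.

```python
import random

def model_size(widths):
    # Rough parameter count of an MLP with 10 inputs and 1 output.
    dims = [10] + list(widths) + [1]
    return sum(a * b + b for a, b in zip(dims, dims[1:]))

def proxy_error(widths):
    # Toy stand-in for validation error: shrinks with capacity,
    # with diminishing returns (no real training happens here).
    return 1.0 / (1.0 + sum(widths) ** 0.5)

def dominates(a, b):
    # a dominates b if it is no worse in both objectives and better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    objs = {tuple(w): (model_size(w), proxy_error(w)) for w in population}
    return [w for w in population
            if not any(dominates(objs[tuple(o)], objs[tuple(w)])
                       for o in population if o != w)]

def mutate(widths, rng):
    # Randomly widen or narrow one layer (widths stay >= 1).
    w = list(widths)
    i = rng.randrange(len(w))
    w[i] = max(1, w[i] + rng.choice([-4, 4]))
    return w

rng = random.Random(0)
population = [[rng.randrange(1, 33) for _ in range(2)] for _ in range(20)]
for _ in range(30):  # evolve: keep the Pareto front, refill by mutation
    front = pareto_front(population)
    population = front + [mutate(rng.choice(front), rng)
                          for _ in range(20 - len(front))]

for w in sorted(pareto_front(population), key=model_size):
    print(w, model_size(w), round(proxy_error(w), 3))
```

The final front exhibits exactly the tension described above: larger architectures achieve a lower error proxy but a higher parameter count, and no single genome wins on both objectives.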

Updated: 2021-09-12