Neural Networks, Hypersurfaces, and the Generalized Radon Transform [Lecture Notes]
IEEE Signal Processing Magazine (IF 14.9), Pub Date: 2020-07-01, DOI: 10.1109/msp.2020.2978822
Soheil Kolouri, Xuwang Yin, Gustavo K. Rohde

Artificial neural networks (ANNs) have long been used as a mathematical modeling tool and have recently found numerous applications in science and technology, including computer vision, signal processing, and machine learning [1], to name a few. Although notable function approximation results exist [2], theoretical explanations have yet to catch up with newer developments, particularly with regard to (deep) hierarchical learning. As a consequence, ANN practitioners are often left with open questions: How many layers should one use? What is the effect of different activation functions? What are the effects of pooling? And many others.
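
To make the function approximation results referenced in [2] concrete, a single-hidden-layer network can be written as a finite sum of ridge functions. This is the standard shallow-network form, not necessarily the exact notation used in the lecture notes, but it hints at the connection to hyperplanes and the Radon transform suggested by the title, since each term depends on the input only through its projection onto a single direction:

f(x) \approx \sum_{i=1}^{N} \alpha_i \, \sigma\!\left(\langle w_i, x \rangle + b_i\right), \qquad x \in \mathbb{R}^d,

where \sigma is the activation function (e.g., a sigmoid or ReLU) and each term \alpha_i \, \sigma(\langle w_i, x \rangle + b_i) is constant along the hyperplanes \langle w_i, x \rangle = \text{const}. Replacing the linear projection \langle w_i, x \rangle with a nonlinear functional yields level sets that are hypersurfaces rather than hyperplanes, which is the viewpoint the generalized Radon transform formalizes.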
