Parametric Deformable Exponential Linear Units for deep neural networks.
Neural Networks (IF 6.0) Pub Date: 2020-02-26, DOI: 10.1016/j.neunet.2020.02.012
Qishang Cheng, HongLiang Li, Qingbo Wu, Lei Ma, King Ngi Ngan

Rectified activation units make an important contribution to the success of deep neural networks in many computer vision tasks. In this paper, we propose a Parametric Deformable Exponential Linear Unit (PDELU) and theoretically verify its effectiveness in improving the convergence speed of the learning procedure. By means of a flexible map shape, the proposed PDELU pushes the mean value of the activation responses closer to zero, which ensures the steepest descent when training a deep neural network. We verify the effectiveness of the proposed method on the image classification task. Extensive experiments on three classical databases (i.e., CIFAR-10, CIFAR-100, and ImageNet-2015) indicate that the proposed method achieves higher convergence speed and better accuracy when embedded into different CNN architectures (i.e., NIN, ResNet, WRN, and DenseNet). Meanwhile, the proposed PDELU outperforms many existing shape-specific activation functions (i.e., Maxout, ReLU, LeakyReLU, ELU, SELU, SoftPlus, Swish) and shape-adaptive activation functions (i.e., APL, PReLU, MPELU, FReLU).
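The abstract does not spell out the PDELU formula, but the name points to the deformed exponential exp_t(x) = [1 + (1 − t)x]_+^{1/(1−t)}, which reduces to the ordinary exponential as t → 1. Below is a minimal PyTorch sketch under that assumption; the deformation hyperparameter t, the per-channel learnable alpha, and the class name PDELU are illustrative choices, not details confirmed by this page.

```python
import torch
import torch.nn as nn


class PDELU(nn.Module):
    """Sketch of a Parametric Deformable Exponential Linear Unit.

    Assumed form (not given on this page), built on the deformed
    exponential exp_t(x) = max(1 + (1 - t) * x, 0) ** (1 / (1 - t)):

        f(x) = x                         if x > 0
        f(x) = alpha * (exp_t(x) - 1)    if x <= 0

    where alpha is learned per channel and t is a fixed deformation
    hyperparameter with t != 1.
    """

    def __init__(self, num_channels: int, t: float = 0.9, alpha_init: float = 1.0):
        super().__init__()
        if t == 1.0:
            raise ValueError("t = 1 degenerates to the ordinary exponential (ELU-like)")
        self.t = t
        # One learnable alpha per channel, broadcast over NCHW inputs.
        self.alpha = nn.Parameter(torch.full((1, num_channels, 1, 1), alpha_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        inv = 1.0 / (1.0 - self.t)
        # Deformed exponential; the clamp keeps the base non-negative so the
        # negative branch saturates at -alpha instead of producing NaNs.
        exp_t = torch.clamp(1.0 + (1.0 - self.t) * x, min=0.0).pow(inv)
        return torch.where(x > 0, x, self.alpha * (exp_t - 1.0))


# Quick smoke test on a random feature map.
act = PDELU(num_channels=64, t=0.9)
y = act(torch.randn(8, 64, 32, 32))
print(y.shape)  # torch.Size([8, 64, 32, 32])
```

Note that for t < 1 the clamp makes the negative branch hit its saturation value −alpha at finite input (x ≤ −1/(1−t)), rather than only asymptotically as in ELU; together with the learnable alpha, this is the kind of flexible map shape the abstract credits with pulling the activation mean toward zero.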




Updated: 2020-02-26