Parametric rectified nonlinear unit (PRenu) for convolution neural networks
Signal, Image and Video Processing (IF 2.3), Pub Date: 2020-07-23, DOI: 10.1007/s11760-020-01746-9
Ilyas El Jaafari , Ayoub Ellahyani , Said Charfi

The activation function unit is an extremely important part of convolutional neural networks; it is the nonlinear transformation applied to the input data of each layer. Using hidden layers with a well-chosen activation function improves both the accuracy and the convergence speed of a CNN. This paper proposes a parametric rectified nonlinear function unit (PRenu). The proposed activation function is similar to Relu: it returns $$x-\alpha \log (x+1)$$ for positive values of $$x$$ (where $$\alpha $$ is between 0 and 1) and zero for negative values. In contrast to Relu, which passes the received gradient unchanged for all positive values during back-propagation, PRenu multiplies it by a factor between $$1-\alpha $$ and 1, depending on the value each neuron received. PRenu has been tested on three datasets (CIFAR-10, CIFAR-100 and Oxflower17) and compared to the Relu activation function. The experimental results show that with the proposed PRenu activation function, CNN convergence is faster and the accuracy is also improved.

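To make the definition concrete, the following is a minimal PyTorch sketch of PRenu as described in the abstract. The module name, the default value of alpha, and the choice of a fixed (non-learned) alpha are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class PRenu(nn.Module):
    """Parametric rectified nonlinear unit (sketch based on the abstract).

    Returns x - alpha * log(x + 1) for x > 0 and 0 for x <= 0,
    with alpha strictly between 0 and 1. Here alpha is a fixed
    hyperparameter (an assumption; whether it is learned is not
    stated in the abstract).
    """

    def __init__(self, alpha: float = 0.5):
        super().__init__()
        assert 0.0 < alpha < 1.0, "alpha must lie strictly between 0 and 1"
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive part: x - alpha * log(x + 1); negative part: 0 (as in Relu).
        pos = torch.clamp(x, min=0.0)
        return pos - self.alpha * torch.log1p(pos)


if __name__ == "__main__":
    act = PRenu(alpha=0.3)
    x = torch.tensor([-1.0, 0.5, 2.0], requires_grad=True)
    act(x).sum().backward()
    # Gradient is 0 for x < 0 and 1 - alpha / (x + 1) for x > 0,
    # i.e. a factor between 1 - alpha and 1, as stated in the abstract.
    print(x.grad)
```

Autograd recovers the back-propagation behaviour described above automatically, since the derivative of $$x-\alpha \log (x+1)$$ for positive $$x$$ is $$1-\alpha /(x+1)$$, which ranges from $$1-\alpha $$ (near zero) to 1 (for large activations).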