The influence of the activation function in a capsule network for brain tumor type classification
International Journal of Imaging Systems and Technology ( IF 3.3 ) Pub Date : 2021-08-10 , DOI: 10.1002/ima.22638
Kwabena Adu 1 , Yongbin Yu 1 , Jingye Cai 1 , Isaac Asare 2 , Jennifer Quahin 3
The hierarchical framework of a capsule network (CapsNet) begins with a standard convolution layer whose core component is an activation function. Among the many existing activation functions, the rectified linear unit (ReLU) is the most widely used in CapsNets and in brain tumor classification tasks. However, ReLU has a shortcoming: its zero derivative for negative inputs can cause neurons to stop activating. Furthermore, the classification accuracy obtained by ReLU with CapsNet on brain tumor data is unsatisfactory. We propose a new activation function, the parametric scaled hyperbolic tangent (PSTanh), which enhances the conventional hyperbolic tangent by avoiding the vanishing gradient, provides a small gradient through two introduced parameters, and enables faster optimization. Eight standard activation functions (tanh, ReLU, Leaky-ReLU, PReLU, ELU, SELU, Swish, and the ReLU-Memristor-Like Activation Function (RMAF)) and the proposed activation are analyzed and compared on brain tumor classification tasks. Furthermore, extensive experiments are conducted on the MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100, and ImageNet datasets using CapsNet models and deep CNN models (AlexNet, SqueezeNet, ResNet50, and DenseNet121). The brain tumor experiments based on both CapsNet and CNN models show that the proposed PSTanh activation achieves better performance than the other functions.
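The "dying ReLU" problem the abstract refers to can be seen directly from the derivatives: ReLU's gradient is exactly zero for all negative inputs, so a neuron stuck in that regime receives no learning signal, whereas a scaled tanh keeps a small nonzero gradient everywhere. The sketch below illustrates this contrast; the `pstanh` form with `alpha` and `beta` parameters is a hypothetical stand-in for illustration only, not the exact PSTanh definition, which is given in the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_grad(x):
    # Derivative is 0 for every x < 0: a neuron whose pre-activations
    # stay negative gets no gradient signal ("dying ReLU").
    return (x > 0).astype(float)

def pstanh(x, alpha=1.0, beta=1.0):
    # Illustrative parametric scaled tanh (assumed form, not the
    # paper's exact PSTanh): alpha scales the output, beta the input.
    return alpha * np.tanh(beta * x)

def pstanh_grad(x, alpha=1.0, beta=1.0):
    # Derivative alpha*beta*(1 - tanh(beta*x)^2) is strictly positive
    # for any finite x, so a small gradient always flows.
    return alpha * beta * (1.0 - np.tanh(beta * x) ** 2)

x = np.array([-3.0, -1.0, 0.5, 2.0])
print(relu_grad(x))    # zero gradient wherever x < 0
print(pstanh_grad(x))  # strictly positive gradient everywhere
```

Negative inputs yield a zero ReLU gradient but a small positive tanh-style gradient, which is the behavior the proposed activation exploits to keep optimization moving.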
