Denoising convolutional neural network inspired via multi-layer convolutional sparse coding
Journal of Electronic Imaging (IF 1.1), Pub Date: 2021-03-01, DOI: 10.1117/1.jei.30.2.023007
Zejia Wen 1, Hailin Wang 1, Yingfan Gong 2, Jianjun Wang 3

Sparse priors for image denoising constitute a classical research topic with a long history in computer vision. We propose an end-to-end supervised neural network, named DnMLCSC-net, inspired by a multi-layer convolutional sparse coding model embedded with symbiotic analysis–synthesis priors for natural image denoising. By unfolding the multi-layer learned iterative soft thresholding algorithm (ML-LISTA) into a convolutional recurrent neural network, all parameters of the model are updated adaptively to minimize a mixed loss via gradient descent with backpropagation. In addition, a combined ReLU function is adopted as the activation function, and inconsistent dilated convolution and batch normalization are empirically introduced into the encoding layers corresponding to the first iteration of ML-LISTA. Experimental results show that our network achieves competitive denoising performance in comparison with several state-of-the-art denoising methods.
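The unfolding idea behind DnMLCSC-net starts from the classical LISTA update, in which a soft-thresholding (shrinkage) step alternates with learned linear maps. A minimal single-layer sketch in numpy, with hypothetical weight matrices `W_e` and `S` standing in for what the paper implements as learnable convolutional layers (this is an illustrative sketch of the generic LISTA recursion, not the authors' multi-layer architecture):

```python
import numpy as np

def soft_threshold(x, theta):
    # Soft-thresholding: the proximal operator of the L1 sparsity prior.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def lista_step(z, y, W_e, S, theta):
    # One unfolded LISTA iteration: z <- soft(W_e @ y + S @ z, theta).
    # Unrolling K such steps, with W_e, S, theta made learnable per step,
    # yields a feed-forward (here recurrent) network trainable by backprop.
    return soft_threshold(W_e @ y + S @ z, theta)

# Tiny demo with random, purely hypothetical weights.
rng = np.random.default_rng(0)
y = rng.standard_normal(8)                # observed (noisy) signal
W_e = 0.1 * rng.standard_normal((16, 8))  # learned analysis/encoding map
S = 0.1 * rng.standard_normal((16, 16))   # learned mutual-inhibition map
z = np.zeros(16)                          # sparse code estimate
for _ in range(3):                        # three unfolded iterations
    z = lista_step(z, y, W_e, S, theta=0.05)
```

In the paper's setting the matrix products become (dilated) convolutions, the threshold becomes the combined ReLU activation, and several such layers are stacked to form the multi-layer model.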

Updated: 2021-03-17