Parseval Proximal Neural Networks
Journal of Fourier Analysis and Applications (IF 1.2), Pub Date: 2020-07-06, DOI: 10.1007/s00041-020-09761-7
Marzieh Hasannasab, Johannes Hertrich, Sebastian Neumayer, Gerlind Plonka, Simon Setzer, Gabriele Steidl

The aim of this paper is twofold. First, we show that a certain concatenation of a proximity operator with an affine operator is again a proximity operator on a suitable Hilbert space. Second, we use our findings to establish so-called proximal neural networks (PNNs) and stable tight frame proximal neural networks. Let \(\mathcal {H}\) and \(\mathcal {K}\) be real Hilbert spaces, \(b \in \mathcal {K}\), and \(T \in \mathcal {B} (\mathcal {H},\mathcal {K})\) a linear operator with closed range and Moore–Penrose inverse \(T^\dagger \). Based on the well-known characterization of proximity operators by Moreau, we prove that for any proximity operator \(\mathrm {Prox}:\mathcal {K}\rightarrow \mathcal {K}\), the operator \(T^\dagger \, \mathrm {Prox}( T \cdot + b)\) is a proximity operator on \(\mathcal {H}\) equipped with a suitable norm. In particular, it follows for the frequently applied soft shrinkage operator \(\mathrm {Prox}= S_{\lambda }:\ell _2 \rightarrow \ell _2\) and any frame analysis operator \(T:\mathcal {H}\rightarrow \ell _2\) that the frame shrinkage operator \(T^\dagger \, S_\lambda \, T\) is a proximity operator on a suitable Hilbert space. The concatenation of proximity operators on \(\mathbb R^d\) equipped with different norms establishes a PNN. If the network arises from tight frame analysis or synthesis operators, then it forms an averaged operator. In particular, it has Lipschitz constant 1 and belongs to the class of so-called Lipschitz networks, which were recently applied to defend against adversarial attacks. Moreover, due to its averaging property, PNNs can be used within so-called Plug-and-Play algorithms with convergence guarantees. In the case of Parseval frames, we call the networks Parseval proximal neural networks (PPNNs). The linear operators involved then lie in a Stiefel manifold, and corresponding minimization methods can be applied to train such networks.
Finally, some proof-of-concept examples demonstrate the performance of PPNNs.
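To make the central claim concrete, the following sketch (a minimal NumPy illustration, not code from the paper) builds the Mercedes-Benz Parseval frame for \(\mathbb R^2\), whose analysis operator \(T\) satisfies \(T^\top T = I\) so that \(T^\dagger = T^\top\), and checks numerically that the frame shrinkage operator \(T^\dagger S_\lambda T\) is nonexpansive, i.e. 1-Lipschitz, as the proximity-operator result guarantees:

```python
import numpy as np

# Mercedes-Benz Parseval frame for R^2: three unit-norm directions at 120-degree
# spacing, scaled by sqrt(2/3) so that the 3x2 analysis operator T obeys T^T T = I.
angles = np.pi / 2 + 2 * np.pi / 3 * np.arange(3)
T = np.sqrt(2.0 / 3.0) * np.stack([np.cos(angles), np.sin(angles)], axis=1)
assert np.allclose(T.T @ T, np.eye(2))  # Parseval condition; hence T^dagger = T^T

def soft_shrink(x, lam):
    """Soft shrinkage S_lambda, the proximity operator of lam * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def frame_shrink(x, lam):
    """Frame shrinkage T^dagger S_lambda(T x); a proximity operator by the paper's result."""
    return T.T @ soft_shrink(T @ x, lam)

# Empirical check of nonexpansiveness (Lipschitz constant 1) on random pairs.
rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    assert (np.linalg.norm(frame_shrink(x, 0.3) - frame_shrink(y, 0.3))
            <= np.linalg.norm(x - y) + 1e-12)
```

Since \(T\) is an isometry for a Parseval frame and both \(S_\lambda\) and \(T^\top\) are nonexpansive, the composition is nonexpansive; the random-pair check above merely confirms this numerically.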
