Un-rectifying Non-linear Networks for Signal Representation
IEEE Transactions on Signal Processing ( IF 5.4 ) Pub Date : 2020-01-01 , DOI: 10.1109/tsp.2019.2957607
Wen-Liang Hwang , Andreas Heinecke

We consider deep neural networks with rectifier activations and max-pooling from a signal representation perspective. In this view, such representations mark the transition from using a single linear representation for all signals to utilizing a large collection of affine linear representations that are tailored to particular regions of the signal space. We propose a novel technique to “un-rectify” the nonlinear activations into data-dependent linear equations and constraints, from which we derive explicit expressions for the affine linear operators, their domains and ranges in terms of the network parameters. We show how increasing the depth of the network refines the domain partitioning and derive atomic decompositions for the corresponding affine mappings that process data belonging to the same partitioning region. In each atomic decomposition the connections over all hidden network layers are summarized and interpreted in a single matrix. We apply the decompositions to study the Lipschitz regularity of the networks and give sufficient conditions for network-depth-independent stability of the representation, drawing a connection to compressible weight distributions. Such analyses may facilitate and promote further theoretical insight and exchange from both the signal processing and machine learning communities.
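The core idea of "un-rectifying" can be illustrated numerically: on the region of input space sharing an activation pattern, each ReLU acts as a data-dependent diagonal 0/1 matrix, so the whole network collapses to a single affine map whose matrix summarizes all hidden layers. A minimal sketch with a hypothetical two-hidden-layer toy network (random weights; not the paper's construction, which also covers max-pooling):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy ReLU network: f(x) = W3 relu(W2 relu(W1 x + b1) + b2) + b3
W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)
W3, b3 = rng.standard_normal((3, 8)), rng.standard_normal(3)

def forward(x):
    h1 = np.maximum(W1 @ x + b1, 0.0)
    h2 = np.maximum(W2 @ h1 + b2, 0.0)
    return W3 @ h2 + b3

def unrectify(x):
    """Replace each ReLU by the 0/1 diagonal matrix it induces at x,
    yielding the affine map (A, c) valid on x's partitioning region."""
    h1 = W1 @ x + b1
    D1 = np.diag((h1 > 0).astype(float))   # activation pattern, layer 1
    h2 = W2 @ (D1 @ h1) + b2
    D2 = np.diag((h2 > 0).astype(float))   # activation pattern, layer 2
    A = W3 @ D2 @ W2 @ D1 @ W1             # one matrix over all hidden layers
    c = W3 @ D2 @ (W2 @ D1 @ b1 + b2) + b3
    return A, c

x = rng.standard_normal(4)
A, c = unrectify(x)
assert np.allclose(forward(x), A @ x + c)  # affine map matches the network at x
```

Deepening the network multiplies in further data-dependent diagonal factors, which refines the input-space partition, consistent with the depth/partitioning discussion in the abstract.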

Updated: 2020-01-01