On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks
arXiv - CS - Computational Geometry Pub Date: 2020-08-20, arXiv:2008.09052
J. Elisenda Grigsby and Kathryn Lindsey

Let F:R^n -> R be a feedforward ReLU neural network. It is well-known that for any choice of parameters, F is continuous and piecewise (affine) linear. We lay some foundations for a systematic investigation of how the architecture of F impacts the geometry and topology of its possible decision regions for binary classification tasks. Following the classical progression for smooth functions in differential topology, we first define the notion of a generic, transversal ReLU neural network and show that almost all ReLU networks are generic and transversal. We then define a partially-oriented linear 1-complex in the domain of F and identify properties of this complex that yield an obstruction to the existence of bounded connected components of a decision region. We use this obstruction to prove that a decision region of a generic, transversal ReLU network F: R^n -> R with a single hidden layer of dimension (n + 1) can have no more than one bounded connected component.
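The "continuous and piecewise affine" property can be checked numerically. The sketch below (not from the paper; all parameter choices are illustrative) builds a one-hidden-layer ReLU network F: R^n -> R with hidden dimension n + 1, matching the architecture in the abstract's final theorem, and verifies that on a region where the ReLU activation pattern is constant, F agrees with a single affine map x -> c·x + d.

```python
import numpy as np

# Illustrative one-hidden-layer ReLU network F: R^n -> R with
# hidden dimension n + 1 (random parameters, fixed seed).
rng = np.random.default_rng(0)
n = 3
W1 = rng.standard_normal((n + 1, n))   # hidden-layer weights
b1 = rng.standard_normal(n + 1)        # hidden-layer biases
w2 = rng.standard_normal(n + 1)        # output weights
b2 = rng.standard_normal()             # output bias

def F(x):
    """Feedforward ReLU network: w2 . relu(W1 x + b1) + b2."""
    return w2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Fix a base point and record which hidden units are active there.
x0 = rng.standard_normal(n)
pattern = W1 @ x0 + b1 > 0

# On the open region with this activation pattern, the inactive units
# contribute nothing, so F reduces to the affine map x -> c . x + d:
c = (w2 * pattern) @ W1        # local gradient of F
d = (w2 * pattern) @ b1 + b2   # local constant term

assert np.isclose(F(x0), c @ x0 + d)

# Small perturbations that preserve the activation pattern stay on
# the same affine piece of F.
for _ in range(5):
    x = x0 + 1e-3 * rng.standard_normal(n)
    if np.all((W1 @ x + b1 > 0) == pattern):
        assert np.isclose(F(x), c @ x + d)
```

The regions of constant activation pattern are exactly the cells cut out by the "bent hyperplanes" of the title; on each cell F is affine, which is what makes the transversality and 1-complex arguments in the paper available.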

Updated: 2020-08-21