Estimate of the Neural Network Dimension Using Algebraic Topology and Lie Theory
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-04-06, DOI: arXiv:2004.02881
Luciano Melodia, Richard Lenz

In this paper we present an approach to determine the smallest possible number of neurons in a layer of a neural network such that the topology of the input space can be learned sufficiently well. We introduce a general procedure based on persistent homology to investigate topological invariants of the manifold on or near which we suspect the data set to lie. We specify the required dimension precisely, assuming that there is a smooth manifold on or near which the data are located. Furthermore, we require that this space is connected and carries a commutative group structure in the mathematical sense. These assumptions allow us to derive a decomposition of the underlying space whose topology is well known. We use the representatives of the $k$-dimensional homology groups from the persistence landscape to determine an integer dimension for this decomposition. This number is the dimension of the embedding that is capable of capturing the topology of the data manifold. We derive the theory and validate it experimentally on toy data sets.
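As an illustration of the counting step described above (a sketch, not the authors' implementation): a compact connected commutative Lie group is a torus $T^n$, whose Betti numbers are $\beta_k = \binom{n}{k}$, so $\beta_1$ alone determines $n$ once the persistent features have been separated from noise. The function names and the `min_lifetime` threshold below are hypothetical choices for this sketch.

```python
from math import comb

def betti_from_barcodes(barcodes, min_lifetime):
    """Estimate Betti numbers from persistence barcodes.

    barcodes: dict mapping homology degree k to a list of
    (birth, death) intervals; bars with lifetime above the
    noise threshold min_lifetime are counted as true features.
    """
    return {k: sum(1 for birth, death in bars if death - birth > min_lifetime)
            for k, bars in barcodes.items()}

def torus_dimension(betti):
    """Read off n for a torus T^n from estimated Betti numbers.

    For T^n, beta_k = C(n, k), so beta_1 = n; the remaining
    degrees serve as a consistency check on the torus assumption.
    """
    n = betti.get(1, 0)
    if all(betti[k] == comb(n, k) for k in betti):
        return n
    return None  # Betti numbers inconsistent with any torus
```

For example, barcodes with one infinite bar in degree 0, two long bars in degree 1, and one long bar in degree 2 yield $\beta = (1, 2, 1)$, matching $T^2$, so the estimated dimension is 2.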

Updated: 2020-11-17