Infinite-dimensional Folded-in-time Deep Neural Networks
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2021-01-08, DOI: arxiv-2101.02966
Florian Stelzer (Institute of Mathematics, Technische Universität Berlin, Germany; Department of Mathematics, Humboldt-Universität zu Berlin, Germany), Serhiy Yanchuk (Institute of Mathematics, Technische Universität Berlin, Germany)

The method recently introduced in arXiv:2011.10115 realizes a deep neural network with just a single nonlinear element and delayed feedback. It is applicable to the description of physically implemented neural networks. In this work, we present an infinite-dimensional generalization, which allows for a more rigorous mathematical analysis and greater flexibility in choosing the weight functions. More precisely, the weights are described by Lebesgue integrable functions instead of step functions. We also provide a functional backpropagation algorithm, which enables gradient-descent training of the weights. In addition, with a slight modification, our concept realizes recurrent neural networks.
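The folded-in-time construction can be illustrated with a short numerical sketch. The Python code below is a minimal illustration under our own assumptions, not the authors' implementation: it integrates a single nonlinear node with one delayed feedback loop, x'(t) = -x(t) + f(w(t) x(t - T) + b(t)), by explicit Euler steps. Cutting the time axis into intervals of length T lets the signal in the k-th interval play the role of the k-th layer's activations, with the modulation functions w and b carrying the weights (smooth placeholders here, so in particular Lebesgue integrable). All concrete choices (T, n_layers, the cosine-shaped w) are hypothetical.

import numpy as np

T = 1.0            # delay = length of one "layer" interval (assumed)
n_layers = 3       # number of folded layers to emulate (assumed)
n_steps = 200      # Euler steps per interval of length T
dt = T / n_steps

rng = np.random.default_rng(0)

# Time-varying weight and bias modulation. In the paper's setting these
# only need to be Lebesgue integrable; smooth placeholders are used here.
def w(t):
    return np.cos(2.3 * t)          # hypothetical weight function

def b(t):
    return 0.1 * np.sin(1.7 * t)    # hypothetical bias function

f = np.tanh                         # the single nonlinearity

# The history segment on [-T, 0) encodes the network input;
# x(0) = 0 serves as the initial condition.
x = np.zeros((n_layers + 1) * n_steps)
x[:n_steps] = 0.5 * rng.standard_normal(n_steps)

# Euler integration of x'(t) = -x(t) + f(w(t) x(t - T) + b(t)).
for i in range(n_steps, len(x) - 1):
    t = (i - n_steps) * dt
    a = w(t) * x[i - n_steps] + b(t)   # delayed, modulated feedback
    x[i + 1] = x[i] + dt * (-x[i] + f(a))

# Reading the signal out interval by interval recovers the "layers".
layers = x[n_steps:].reshape(n_layers, n_steps)
print("per-layer activation norms:", np.linalg.norm(layers, axis=1))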

Updated: 2021-01-11