Metric entropy limits on recurrent neural network learning of linear dynamical systems
Applied and Computational Harmonic Analysis (IF 2.5), Pub Date: 2021-12-20, DOI: 10.1016/j.acha.2021.12.004
Clemens Hutter, Recep Gül, Helmut Bölcskei

One of the most influential results in neural network theory is the universal approximation theorem [1], [2], [3], which states that continuous functions can be approximated to within arbitrary accuracy by single-hidden-layer feedforward neural networks. The purpose of this paper is to establish a result in this spirit for the approximation of general discrete-time linear dynamical systems—including time-varying systems—by recurrent neural networks (RNNs). For the subclass of linear time-invariant (LTI) systems, we devise a quantitative version of this statement. Specifically, measuring the complexity of the considered class of LTI systems through metric entropy according to [4], we show that RNNs can optimally learn—or identify in system-theory parlance—stable LTI systems. For LTI systems whose input-output relation is characterized through a difference equation, this means that RNNs can learn the difference equation from input-output traces in a metric-entropy optimal manner.
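
The structural correspondence underlying this result can be made concrete: a discrete-time LTI system in state-space form, x[t+1] = A x[t] + B u[t], y[t] = C x[t] + D u[t], obeys the same recursion as a linear RNN cell, so a suitably parametrized RNN reproduces the system's input-output map exactly. The following minimal NumPy sketch illustrates this identification; the matrices, dimensions, and names (lti_response, linear_rnn_response, W_h, W_u, W_o, W_d) are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a discrete-time LTI
# system x[t+1] = A x[t] + B u[t], y[t] = C x[t] + D u[t], and a linear RNN
# cell with the same recursion. All matrices below are random placeholders.

rng = np.random.default_rng(0)
n, m, p = 4, 1, 1                      # state, input, output dimensions

A = 0.5 * rng.standard_normal((n, n))  # scaled down to keep the system stable
B = rng.standard_normal((n, m))
C = rng.standard_normal((p, n))
D = rng.standard_normal((p, m))

def lti_response(u):
    """Run the LTI state-space recursion on an input trace u of shape (T, m)."""
    x = np.zeros(n)
    ys = []
    for u_t in u:
        ys.append(C @ x + D @ u_t)
        x = A @ x + B @ u_t
    return np.array(ys)

def linear_rnn_response(u, W_h, W_u, W_o, W_d):
    """Linear RNN cell: h[t+1] = W_h h[t] + W_u u[t], y[t] = W_o h[t] + W_d u[t].

    Choosing W_h = A, W_u = B, W_o = C, W_d = D reproduces the LTI system
    exactly; learning such weights from input-output traces is the sense in
    which an RNN 'identifies' the system.
    """
    h = np.zeros(W_h.shape[0])
    ys = []
    for u_t in u:
        ys.append(W_o @ h + W_d @ u_t)
        h = W_h @ h + W_u @ u_t
    return np.array(ys)

u = rng.standard_normal((50, m))       # an input-output trace
assert np.allclose(lti_response(u), linear_rnn_response(u, A, B, C, D))
```

The paper's contribution is quantitative rather than this exact-matching observation: it measures how many bits (metric entropy, in the sense of [4]) are needed to describe the class of stable LTI systems to accuracy ε, and shows that RNNs attain this fundamental limit.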



Updated: 2021-12-20