Associative Memory in Iterated Overparameterized Sigmoid Autoencoders
arXiv - CS - Machine Learning. Pub Date: 2020-06-30, arXiv:2006.16540
Yibo Jiang, Cengiz Pehlevan

Recent work showed that overparameterized autoencoders can be trained to implement associative memory via iterated maps: when all eigenvalues of the trained network's input-output Jacobian have norms strictly below one, stored examples become attracting fixed points. Here, we theoretically analyze this phenomenon for sigmoid networks by leveraging recent developments in deep learning theory, especially the correspondence between training neural networks in the infinite-width limit and performing kernel regression with the Neural Tangent Kernel (NTK). We find that, under certain conditions, overparameterized sigmoid autoencoders trained in the NTK limit can have attractors both when trained on a single example and when trained on multiple examples. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one as the input norm increases, giving rise to associative memory.
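The stability criterion in the abstract lends itself to a compact numerical check. Below is a minimal sketch, not the authors' code: the one-hidden-layer architecture, the parameter names, and the `retrieve` helper are hypothetical stand-ins for a trained network. It illustrates the iterated-map view: a stored pattern x* is an associative memory of a trained autoencoder f when f(x*) = x* and every eigenvalue of the input-output Jacobian at x* has norm strictly below one, so iterating f from a nearby corrupted input converges back to x*.

```python
# Minimal sketch of the attractor criterion for an iterated sigmoid
# autoencoder. Weights are assumed to come from a trained network;
# nothing here reproduces the paper's NTK analysis itself.
import jax
import jax.numpy as jnp

def autoencoder(params, x):
    """One-hidden-layer sigmoid autoencoder: f(x) = W2 @ sigmoid(W1 @ x + b1) + b2."""
    W1, b1, W2, b2 = params
    return W2 @ jax.nn.sigmoid(W1 @ x + b1) + b2

def spectral_radius_at(params, x):
    """Largest eigenvalue norm of the input-output Jacobian df/dx at x."""
    J = jax.jacfwd(lambda v: autoencoder(params, v))(x)
    return jnp.max(jnp.abs(jnp.linalg.eigvals(J)))

def retrieve(params, x0, n_iter=100):
    """Iterate the trained map; converges to a stored pattern
    if x0 lies in that pattern's basin of attraction."""
    x = x0
    for _ in range(n_iter):
        x = autoencoder(params, x)
    return x

# Hypothetical usage, given trained `params` and a stored pattern `x_star`:
#   spectral_radius_at(params, x_star) < 1.0   # stability criterion
#   retrieve(params, x_star + noise)           # pattern completion
```

The `retrieve` loop is the "iterated" map of the title; `spectral_radius_at` checks the condition on the Jacobian's eigenvalue norms that the abstract identifies as the requirement for associative memory.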

Last updated: 2020-08-17