Overparameterized neural networks implement associative memory [Computer Sciences]
Proceedings of the National Academy of Sciences of the United States of America (IF 9.412). Pub Date: 2020-10-16. DOI: 10.1073/pnas.2005013117
Adityanarayanan Radhakrishnan, Mikhail Belkin, Caroline Uhler

Identifying computational mechanisms for memorization and retrieval of data is a long-standing problem at the intersection of machine learning and neuroscience. Our main finding is that standard overparameterized deep neural networks trained using standard optimization methods implement such a mechanism for real-valued data. Empirically, we show that 1) overparameterized autoencoders store training samples as attractors and thus, iterating the learned map leads to sample recovery, and that 2) the same mechanism allows for encoding sequences of examples and serves as an even more efficient mechanism for memory than autoencoding. Theoretically, we prove that when trained on a single example, autoencoders store the example as an attractor. Lastly, by treating a sequence encoder as a composition of maps, we prove that sequence encoding provides a more efficient mechanism for memory than autoencoding.
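The retrieval mechanism described above can be illustrated with a minimal sketch, which is not the authors' code: train a heavily overparameterized autoencoder to near-zero reconstruction error on a handful of real-valued samples, then iterate the learned map from a corrupted input and check whether it converges back to a stored training sample. The network width, optimizer settings, noise level, and iteration count below are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A few real-valued training samples (e.g., flattened images); random vectors here.
n_samples, dim = 4, 64
X = torch.rand(n_samples, dim)

# Overparameterized autoencoder: far more hidden units than samples or dimensions.
f = nn.Sequential(
    nn.Linear(dim, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, dim),
)

# Standard optimization to (near) zero reconstruction error on the training set.
opt = torch.optim.Adam(f.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    loss = ((f(X) - X) ** 2).mean()
    loss.backward()
    opt.step()

# Retrieval: start from a noisy version of a stored sample and iterate the map.
# If the sample is an attractor, the iterates should return to it.
with torch.no_grad():
    x = X[0] + 0.1 * torch.randn(dim)
    for _ in range(100):
        x = f(x)
    print("distance to stored sample after iteration:",
          torch.norm(x - X[0]).item())
```

The same setup extends to the sequence-encoding case by training the network to map each example to the next one in the sequence, so that iterating the map walks through the stored sequence rather than returning to a fixed point.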

Updated: 2020-10-17
