Avoiding Catastrophic Forgetting
Trends in Cognitive Sciences (IF 19.9). Pub Date: 2017-06-01. DOI: 10.1016/j.tics.2017.04.001
Michael E. Hasselmo

Humans regularly perform new learning without losing memory for previous information, but neural network models suffer from the phenomenon of catastrophic forgetting in which new learning impairs prior function. A recent article presents an algorithm that spares learning at synapses important for previously learned function, reducing catastrophic forgetting.
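The algorithm referred to (elastic weight consolidation, Kirkpatrick et al. 2017) can be sketched in a few lines: after learning a first task, each weight is assigned an importance estimate, and a quadratic penalty discourages later training from moving important weights away from their consolidated values. The sketch below is illustrative, not the published implementation; the function names (`ewc_penalty`, `ewc_gradient`) and the toy importance values are assumptions.

```python
import numpy as np

def ewc_penalty(w, w_old, importance, lam=1.0):
    """Quadratic consolidation penalty: grows when weights that were
    important for the previous task drift from their old values."""
    return 0.5 * lam * np.sum(importance * (w - w_old) ** 2)

def ewc_gradient(w, w_old, importance, lam=1.0):
    """Gradient of the penalty: pulls important weights back toward
    their previously learned values."""
    return lam * importance * (w - w_old)

# Toy example: weight 0 is important for the old task, weight 1 is not.
w_old = np.array([1.0, -0.5])
importance = np.array([10.0, 0.0])
w = w_old.copy()

# Train on a new task whose loss 0.5*||w||^2 wants both weights at zero,
# while the penalty protects the important weight.
lr = 0.1
for _ in range(50):
    new_task_grad = w  # gradient of 0.5*||w||^2
    w = w - lr * (new_task_grad + ewc_gradient(w, w_old, importance))

# The important weight settles near its old value (10/11 ≈ 0.91),
# while the unimportant weight is free to move to ~0 for the new task.
print(w)
```

At the fixed point the important weight equals `lam*F*w_old / (1 + lam*F)`, so higher importance keeps it closer to its consolidated value; the unprotected weight adapts fully to the new task.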
