Synaptic metaplasticity in binarized neural networks
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2021-01-19. DOI: arxiv-2101.07592. Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin, Damien Querlioz
Unlike the brain, artificial neural networks, including state-of-the-art deep
neural networks for computer vision, are subject to "catastrophic forgetting":
they rapidly forget the previous task when trained on a new one. Neuroscience
suggests that biological synapses avoid this issue through the process of
synaptic consolidation and metaplasticity: the plasticity itself changes upon
repeated synaptic events. In this work, we show that this concept of
metaplasticity can be transferred to a particular type of deep neural networks,
binarized neural networks, to reduce catastrophic forgetting.
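The abstract's core idea can be sketched in code. In a binarized network, the weight used at inference is the sign of a hidden real-valued variable; a metaplasticity-inspired rule makes updates that push a hidden variable toward a sign flip weaker the larger its magnitude already is, so well-consolidated weights resist being overwritten by a new task. The attenuation function below (`1 - tanh(m|w|)^2`) and the function names are illustrative assumptions, a minimal sketch of the concept rather than the authors' exact algorithm.

```python
import numpy as np

def metaplastic_update(hidden_w, grad, lr=0.1, m=1.0):
    """Metaplasticity-inspired update for hidden (latent) weights.

    The binary weight used at inference is sign(hidden_w). Updates that
    would shrink |hidden_w| -- i.e. move it toward a sign flip -- are
    attenuated more strongly the larger |hidden_w| already is, so
    consolidated weights become harder to overwrite.
    """
    delta = -lr * grad
    # Is this update pushing the hidden value toward zero?
    toward_flip = np.sign(delta) != np.sign(hidden_w)
    # Attenuation in (0, 1]: ~1 for small |hidden_w|, ~0 for large.
    # (An assumed smooth choice, not necessarily the paper's.)
    atten = 1.0 - np.tanh(m * np.abs(hidden_w)) ** 2
    delta = np.where(toward_flip, atten * delta, delta)
    return hidden_w + delta

# A consolidated weight (large |hidden|) barely moves under a gradient
# pushing it toward a sign flip; a weakly consolidated one moves freely.
w = np.array([2.0, 0.1])
g = np.array([1.0, 1.0])       # same gradient for both weights
w_new = metaplastic_update(w, g)
```

Here both updates point toward zero, but the consolidated weight (2.0) changes far less than the unconsolidated one (0.1), which is the mechanism the abstract credits with reducing catastrophic forgetting.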
Updated: 2021-01-20