Enhancing network modularity to mitigate catastrophic forgetting
Applied Network Science Pub Date : 2020-11-26 , DOI: 10.1007/s41109-020-00332-9
Lu Chen , Masayuki Murata

Catastrophic forgetting occurs when a learning algorithm, in acquiring a new skill, overwrites the connections that encode previously learned skills. As learning problems grow in scale and complexity, a modular approach to neural networks has been deemed necessary, since separating functionality into physically distinct network modules should intuitively reduce learning interference. In practice, however, designing such modularity algorithmically is difficult, as it requires expert design and trial and error. Kashtan et al. found that evolution under an environment that changes in a modular fashion leads to the spontaneous emergence of a modular network structure. In this paper, we aim to solve the reverse problem of the modularly varying goal (MVG) approach: obtaining a highly modular structure that mitigates catastrophic forgetting and also applies to realistic data. First, by applying MVG to a realistic dataset, we confirm that a configuration with a highly modular structure exists and that the resulting neural network mitigates catastrophic forgetting. Next, we solve the reverse problem; that is, we propose a method that directly obtains a highly modular structure able to mitigate catastrophic forgetting. Because an MVG-obtained network keeps its intra-module elements relatively stable while leaving the inter-module elements relatively variable, our method constrains the weights so that inter-module elements remain more variable than intra-module ones. The results show that the obtained neural network has a highly modular structure and learns a previously unlearned goal faster than a network trained without this method.
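The core idea of the proposed constraint can be sketched in code. The snippet below is a minimal illustration, not the authors' exact algorithm: given a module assignment for the units of one weight layer, it builds a per-weight update scale so that intra-module weights (which encode previously acquired skills) change more slowly than the inter-module weights that MVG leaves relatively variable. The function names, the scale values, and the plain-SGD update are all assumptions made for illustration.

```python
# Hypothetical sketch (not the authors' exact algorithm): damp updates to
# intra-module weights while letting inter-module weights stay variable.

def update_mask(modules_in, modules_out, inter_scale=1.0, intra_scale=0.1):
    """Return a matrix of per-weight learning-rate scales.

    A weight connecting two units in the same module gets intra_scale
    (updated slowly); a weight crossing modules gets inter_scale.
    """
    return [
        [intra_scale if m_out == m_in else inter_scale for m_in in modules_in]
        for m_out in modules_out
    ]

def apply_update(W, grad, mask, lr=0.01):
    """One SGD step with per-weight scaling: W <- W - lr * mask * grad."""
    return [
        [w - lr * s * g for w, s, g in zip(w_row, s_row, g_row)]
        for w_row, s_row, g_row in zip(W, mask, grad)
    ]

# Toy layer: 4 inputs and 4 outputs, each split into two modules.
m_in = [0, 0, 1, 1]
m_out = [0, 0, 1, 1]
mask = update_mask(m_in, m_out)
# mask[0][0] is intra-module (0.1); mask[0][2] is inter-module (1.0),
# so inter-module weights move 10x faster under the same gradient.
```

In a real training loop this scaling would be applied to the gradients of each layer, with the module assignment obtained from a community-detection step on the network's weighted connectivity.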



Updated: 2020-11-27