A modularity comparison of Long Short-Term Memory and Morphognosis neural networks
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2021-04-23, DOI: arxiv-2104.11410
Thomas E. Portegys

This study compares the modularity performance of two artificial neural network architectures: a Long Short-Term Memory (LSTM) recurrent network, and Morphognosis, a neural network based on a hierarchy of spatial and temporal contexts. Mazes are used to measure performance, defined as the ability to use independently learned mazes to solve mazes composed of them. A maze is a sequence of rooms connected by doors. The modular task is implemented as follows: at the beginning of the maze, an initial door choice forms a context that must be retained until the end of an intervening maze, where the same door must be chosen again to reach the goal. For testing, the door-association mazes and separately trained intervening mazes are presented together for the first time. While both neural networks perform well during training, the testing performance of Morphognosis is significantly better than that of the LSTM on this modular task.
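The modular test condition described above can be sketched as a simple sequence composition; the function names, room encodings, and door counts below are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of the modular maze task: a maze is a sequence of
# rooms, and the modular test inserts an independently learned intervening
# maze between an initial door choice and a final door choice that must
# match it. Encodings here are assumptions for illustration only.

from typing import List, Tuple

def make_door_association_maze(door: int) -> Tuple[List[int], int]:
    """A one-room maze whose initial door choice must be repeated
    at the end to reach the goal."""
    return [door], door

def compose_modular_maze(door: int, intervening: List[int]) -> Tuple[List[int], int]:
    """Compose the test-time maze: door-choice prefix, an independently
    learned intervening maze, then the matching door choice at the goal.
    The initial door forms a context the network must retain throughout."""
    prefix, goal_door = make_door_association_maze(door)
    return prefix + intervening + [goal_door], goal_door

# Example: door 2 chosen first, a 4-room intervening maze, door 2 again at the end.
sequence, target = compose_modular_maze(2, intervening=[0, 1, 1, 0])
```

At test time, sequences like this pair door-association mazes with intervening mazes the networks were trained on separately, which is what probes modularity rather than rote sequence memorization.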

Updated: 2021-04-26