Feed-forward versus recurrent architecture and local versus cellular automata distributed representation in reservoir computing for sequence memory learning
Artificial Intelligence Review (IF 10.7) · Pub Date: 2020-02-12 · DOI: 10.1007/s10462-020-09815-8
Mrwan Margem , Osman S. Gedik

Reservoir computing based on cellular automata (ReCA) builds a novel bridge between automata computational theory and recurrent neural networks. ReCA is trained here to solve 5-bit memory tasks. Several methods are proposed for implementing the reservoir; among them, the distributed representation of cellular automata (CA) in a recurrent architecture solves the 5-bit tasks with minimal complexity and a minimal number of training examples. The CA distributed representation in a recurrent architecture outperforms, in order, the local representation in a recurrent architecture (stack reservoir), echo state networks, and feed-forward architectures using either local or distributed representations. Extracting features from the reservoir via the natural diffusion of CA states yields state-of-the-art results in terms of feature-vector length and the number of required training examples. A further extension combines the reservoir CA states using an XOR, Binary, or Gray operator to produce a single feature vector and thus reduce the feature space; this method gives promising results, but the natural diffusion of CA states still performs better. Because the reservoir uses elementary CA, ReCA can be considered to operate near the lower bound of complexity.
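To make the reservoir mechanics concrete, here is a minimal, illustrative sketch of a ReCA-style pipeline, not the authors' implementation: an elementary CA (rule 90, chosen arbitrarily) serves as the reservoir, input bits are mapped onto randomly chosen cells, the CA is evolved for a few steps, and the concatenated states form the feature vector (the "natural diffusion" extraction). An XOR-combined single vector is also shown as in the feature-space-reduction extension. All names (eca_step, reca_features), the rule, width, and step count are assumptions for illustration; the actual 5-bit memory task feeds a timed input sequence and trains a linear readout on these features, which is omitted here.

```python
import numpy as np

def eca_step(state: np.ndarray, rule: int = 90) -> np.ndarray:
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    neighborhood = (left << 2) | (state << 1) | right      # 3-bit code, 0..7
    # Standard ECA encoding: output for pattern b is bit b of the rule number.
    table = np.array([(rule >> k) & 1 for k in range(8)], dtype=np.uint8)
    return table[neighborhood]

def reca_features(input_bits: np.ndarray, width: int = 64, steps: int = 4,
                  rule: int = 90, seed: int = 0):
    """Encode input onto random cells, evolve the CA, collect feature vectors.

    Hypothetical parameters (width, steps, rule) stand in for whatever the
    paper's experiments actually use.
    """
    rng = np.random.default_rng(seed)
    positions = rng.choice(width, size=input_bits.size, replace=False)
    state = np.zeros(width, dtype=np.uint8)
    state[positions] = input_bits                  # random input mapping
    history = [state]
    for _ in range(steps):
        state = eca_step(state, rule)
        history.append(state)
    # "Natural diffusion": concatenate all CA states into one long vector.
    concat = np.concatenate(history)
    # Extension: XOR-combine the states into a single width-sized vector.
    xor_combined = np.bitwise_xor.reduce(np.stack(history), axis=0)
    return concat, xor_combined

if __name__ == "__main__":
    bits = np.array([1, 0, 1, 1, 0], dtype=np.uint8)   # a 5-bit input pattern
    concat, xored = reca_features(bits)
    print(concat.shape, xored.shape)  # (320,) vs (64,): reduced feature space
```

The shape difference in the final print line illustrates the trade-off the abstract describes: the XOR-combined vector shrinks the feature space by a factor of steps + 1, at the cost of the richer diffusion features that give the best results.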

Updated: 2020-02-12