Distributed stochastic configuration networks with cooperative learning paradigm
Information Sciences (IF 8.1) | Pub Date: 2020-06-27 | DOI: 10.1016/j.ins.2020.05.112
Wu Ai , Dianhui Wang

As a new category of randomized neural networks (RNNs), stochastic configuration networks (SCNs) have demonstrated great potential for data analytics. Unlike conventional randomized learning techniques such as random vector functional-link (RVFL) networks, SCNs employ a stochastic configuration mechanism for assigning the input parameters, which guarantees the universal approximation capability of the resulting learner model. In this paper, a distributed version of SCN is developed for decentralized datasets under a cooperative learning paradigm. The proposed approach handles datasets stored across a network of multiple learning agents without any fusion center. Specifically, we reformulate the centralized learning problem in an equivalent form that decomposes it into subproblems coupled across the network under a consensus constraint. A cooperative configuration scheme is then proposed for randomly assigning the input weights and biases. Finally, the output weights are evaluated iteratively using the well-known parallel alternating direction method of multipliers (ADMM). Simulation studies with comparisons on three benchmark datasets are carried out. The experimental results indicate that the proposed learning scheme performs well and outperforms distributed RVFL networks.
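The output-weight step described above can be illustrated with a small sketch. This is not the authors' implementation: the paper's cooperative scheme for configuring input weights and biases is omitted, and the function below only shows how a shared output-weight vector can be computed over local datasets via consensus ADMM, with each agent solving a local regularized least-squares subproblem and agreeing on a global average (all names and parameter values here are illustrative assumptions).

```python
import numpy as np

def consensus_admm_lsq(H_list, y_list, rho=1.0, iters=300):
    """Illustrative consensus ADMM for shared least-squares output weights.

    Agent k holds only its local hidden-layer output matrix H_k and
    targets y_k; agents exchange local estimates w_k (and duals u_k)
    through the consensus variable z, with no fusion center for raw data.
    """
    d = H_list[0].shape[1]
    n = len(H_list)
    w = [np.zeros(d) for _ in range(n)]
    u = [np.zeros(d) for _ in range(n)]
    z = np.zeros(d)
    # Pre-factor each agent's local (ridge-regularized) normal-equation matrix.
    mats = [np.linalg.inv(H.T @ H + rho * np.eye(d)) for H in H_list]
    for _ in range(iters):
        # Local w-update: least squares pulled toward the consensus point.
        w = [M @ (H.T @ y + rho * (z - uk))
             for M, H, y, uk in zip(mats, H_list, y_list, u)]
        # Global z-update: network average (the consensus step).
        z = np.mean([wk + uk for wk, uk in zip(w, u)], axis=0)
        # Dual update.
        u = [uk + wk - z for uk, wk in zip(u, w)]
    return z

# Toy check: 3 agents whose local data come from one fixed linear model,
# so the consensus solution should recover the shared weight vector.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
H_list = [rng.normal(size=(40, 5)) for _ in range(3)]
y_list = [H @ w_true for H in H_list]
w_admm = consensus_admm_lsq(H_list, y_list)
```

In practice the z-update would itself be computed by in-network averaging rather than the global `np.mean` used here for brevity.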




Updated: 2020-06-27