Gossip-based distributed stochastic mirror descent for constrained optimization
Neural Networks (IF 7.8) Pub Date: 2024-04-05, DOI: 10.1016/j.neunet.2024.106291
Xianju Fang , Baoyong Zhang , Deming Yuan

This paper considers a distributed constrained optimization problem over a multi-agent network in the non-Euclidean sense. The gossip protocol is adopted to relieve the communication burden and to adapt to the constantly changing topology of the network. Based on this idea, a gossip-based distributed stochastic mirror descent (GB-DSMD) algorithm is proposed to handle the problem under consideration. The performance of GB-DSMD is analyzed for constant and diminishing step sizes, respectively. When the step size is constant, an error bound is derived between the optimal function value and the expected function value at the algorithm's average iterate. For a diminishing step size, it is proved that the output of the algorithm uniformly approaches the optimal value with probability 1. Finally, distributed logistic regression is reported as a numerical example to demonstrate the effectiveness of the GB-DSMD algorithm.
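The abstract gives no pseudocode; the following is a minimal, illustrative Python sketch of the gossip-plus-mirror-descent pattern it describes, under assumptions not taken from the paper: the constraint set is the probability simplex, the mirror map is negative entropy (so the mirror descent step becomes an exponentiated-gradient update), agents sit on a ring, and the objective is a toy distributed logistic-regression loss. All names, topology, and constants here are hypothetical.

```python
import numpy as np

# Sketch of gossip-based distributed stochastic mirror descent (GB-DSMD).
# Assumptions (not from the paper): simplex constraint set, negative-entropy
# mirror map, ring gossip topology, toy logistic-regression data.

rng = np.random.default_rng(0)

n_agents, dim, n_local = 8, 5, 20
# Each agent holds a private batch of (features, labels) for logistic regression.
X = [rng.normal(size=(n_local, dim)) for _ in range(n_agents)]
y = [rng.integers(0, 2, size=n_local) * 2.0 - 1.0 for _ in range(n_agents)]

def stochastic_grad(i, w):
    """Logistic-loss gradient at agent i on one uniformly sampled local datum."""
    k = rng.integers(n_local)
    xi, yi = X[i][k], y[i][k]
    return -yi * xi / (1.0 + np.exp(yi * xi @ w))

# Ring topology: agent i can gossip with agents i-1 and i+1 (mod n_agents).
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents] for i in range(n_agents)}

w = np.full((n_agents, dim), 1.0 / dim)  # all iterates start at the simplex center
avg = w.copy()                           # running average of the iterates

T = 5000
for t in range(1, T + 1):
    # Gossip step: a random edge wakes up and its two endpoints average.
    i = rng.integers(n_agents)
    j = rng.choice(neighbors[i])
    mixed = 0.5 * (w[i] + w[j])
    w[i] = w[j] = mixed

    # Stochastic mirror descent step with diminishing step size a_t = c / sqrt(t).
    step = 0.5 / np.sqrt(t)
    for a in (i, j):
        g = stochastic_grad(a, w[a])
        z = w[a] * np.exp(-step * g)  # exponentiated-gradient update
        w[a] = z / z.sum()            # renormalize back onto the simplex

    avg += (w - avg) / t              # average iterate over time

print("max disagreement of average iterates:", np.max(np.abs(avg - avg.mean(axis=0))))
```

The two step-size regimes in the abstract map onto this sketch directly: replacing `step` with a constant gives the setting of the error bound on the average iterate `avg`, while the `c / sqrt(t)` schedule shown corresponds to the regime with almost-sure convergence.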
