On the rates of convergence of parallelized averaged stochastic gradient algorithms
Statistics (IF 1.9) · Pub Date: 2020-05-03 · DOI: 10.1080/02331888.2020.1764557
Antoine Godichon-Baggioni, Sofiane Saadane

ABSTRACT The growing interest in high-dimensional and functional data analysis has led, over the last decade, to important research and a substantial number of new techniques. Parallelized algorithms, which distribute the data across different machines and process them there, are a good way to deal with large samples taking values in high-dimensional spaces. We introduce here a parallelized averaged stochastic gradient algorithm that treats the data efficiently and recursively, without requiring the data to be uniformly distributed across the machines. The rate of convergence in quadratic mean, as well as the asymptotic normality of the parallelized estimates, is given for strongly convex and locally strongly convex objectives.
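The abstract does not spell out the recursion, but the general scheme it describes can be sketched: each machine runs an averaged stochastic gradient (Polyak-Ruppert) recursion on its local samples, and the parallelized estimate combines the local averages, weighting each by its local sample size so that a non-uniform split of the data is handled. The sketch below is a minimal illustration of that idea, not the authors' exact algorithm: the toy quadratic objective (estimating a mean), the step sizes, and all function names are our own assumptions.

```python
import numpy as np

def averaged_sgd(samples, step_c=1.0, step_alpha=0.66):
    """Averaged SGD on one machine's local samples (illustrative sketch).

    Robbins-Monro recursion theta_{n+1} = theta_n - gamma_n * g_n with
    steps gamma_n = c / n**alpha, alpha in (1/2, 1), followed by the
    Polyak-Ruppert average theta_bar_n = (1/n) * sum_k theta_k.
    """
    theta = np.zeros(samples.shape[1])
    theta_bar = np.zeros(samples.shape[1])
    for n, x in enumerate(samples, start=1):
        grad = theta - x  # gradient of the toy loss 0.5 * ||theta - X||^2
        theta = theta - (step_c / n ** step_alpha) * grad
        theta_bar += (theta - theta_bar) / n  # running average
    return theta_bar, len(samples)

def parallelized_estimate(shards):
    """Combine the machines' averaged estimates, weighting each by its
    local sample size, so the split across machines need not be uniform."""
    results = [averaged_sgd(s) for s in shards]
    total = sum(n for _, n in results)
    return sum(n * est for est, n in results) / total

rng = np.random.default_rng(0)
theta_star = np.array([1.0, -2.0])
data = theta_star + rng.normal(size=(10000, 2))
shards = np.split(data, [1000, 4000])  # deliberately non-uniform split
print(parallelized_estimate(shards))   # should be close to theta_star
```

Because each machine only keeps its running iterate and running average, the data are processed recursively (one pass, no storage), which is the efficiency the abstract refers to.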

Updated: 2020-05-03