Incremental Without Replacement Sampling in Nonconvex Optimization
Journal of Optimization Theory and Applications ( IF 1.9 ) Pub Date : 2021-06-21 , DOI: 10.1007/s10957-021-01883-2
Edouard Pauwels

Minibatch decomposition methods for empirical risk minimization are commonly analyzed in a stochastic approximation setting, also known as sampling with replacement. Modern implementations of such techniques, however, are incremental: they rely on sampling without replacement, for which the available analysis is much scarcer. We provide convergence guarantees for the latter variant by analyzing a versatile incremental gradient scheme. For this scheme, we consider constant, decreasing, and adaptive step sizes. In the smooth setting, we obtain explicit complexity estimates in terms of the epoch counter. In the nonsmooth setting, we prove that the sequence is attracted by solutions of the optimality conditions of the problem.
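The without-replacement variant discussed in the abstract can be illustrated with a minimal sketch: instead of drawing each sample independently (with replacement), every epoch visits each data point exactly once in a freshly shuffled order, i.e. random reshuffling. The quadratic loss, step-size schedule, and all function names below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def reshuffled_incremental_gradient(A, b, x0, n_epochs=50, step0=0.05):
    """Illustrative sketch (not the paper's algorithm): minimize
    (1/n) * sum_i (a_i^T x - b_i)^2 with an incremental gradient method
    that samples WITHOUT replacement (one full shuffled pass per epoch)."""
    rng = np.random.default_rng(0)
    n = A.shape[0]
    x = x0.copy()
    for epoch in range(n_epochs):
        step = step0 / (epoch + 1)      # one of the schedules considered: decreasing steps
        order = rng.permutation(n)      # without-replacement sampling: each index once per epoch
        for i in order:
            grad_i = 2.0 * (A[i] @ x - b[i]) * A[i]  # gradient of the i-th loss term
            x -= step * grad_i
    return x

# Usage: a consistent overdetermined least-squares problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = reshuffled_incremental_gradient(A, b, np.zeros(5))
print(np.linalg.norm(x_hat - x_true))
```

The key contrast with the stochastic-approximation analysis is the sampling line: with-replacement sampling would draw `i = rng.integers(n)` at every inner step, whereas `rng.permutation(n)` guarantees each sample contributes exactly once per epoch, matching how practical epoch-based implementations behave.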




Updated: 2021-06-22