Improved SVRG for finite sum structure optimization with application to binary classification
Journal of Industrial and Management Optimization (IF 1.2) Pub Date: 2019-05-29, DOI: 10.3934/jimo.2019052
Guangmei Shao, Wei Xue, Gaohang Yu, Xiao Zheng

This paper studies a stochastic variance reduced gradient (SVRG) method for minimizing the sum of a finite number of smooth convex functions, a problem that arises widely in machine learning and data mining. Inspired by the excellent performance of the two-point stepsize gradient method in batch learning, we present an improved SVRG algorithm, named the stochastic two-point stepsize gradient method. Under mild conditions, the proposed method achieves a linear convergence rate $ O(\rho^k) $ for smooth and strongly convex functions, where $ \rho\in(0.68, 1) $. Simulation experiments on several benchmark data sets are reported to demonstrate the performance of the proposed method.
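To make the finite-sum setting and the "two-point stepsize" idea concrete, below is a minimal NumPy sketch of an SVRG loop whose stepsize is set from two successive snapshots via a Barzilai-Borwein (two-point) rule, applied to l2-regularized logistic regression for binary classification. The objective, variable names, the 1/m scaling of the step, and all parameter values are illustrative assumptions; this is not the authors' exact algorithm.

```python
import numpy as np

def svrg_bb(grad_i, full_grad, w0, n, m, eta0, epochs, rng=None):
    """SVRG with a Barzilai-Borwein (two-point) stepsize.

    grad_i(w, i): gradient of the i-th component function at w.
    full_grad(w): full gradient (1/n) * sum_i grad_i(w, i).
    m: number of inner iterations per epoch.
    eta0: stepsize for the first epoch, before two-point information exists.
    NOTE: an illustrative sketch, not the method proposed in the paper.
    """
    rng = rng or np.random.default_rng(0)
    w_tilde = w0.copy()
    prev_w, prev_g = None, None
    eta = eta0
    for _ in range(epochs):
        g_tilde = full_grad(w_tilde)           # full gradient at the snapshot
        if prev_w is not None:                  # two-point (BB) stepsize from
            s = w_tilde - prev_w                # successive snapshots and
            y = g_tilde - prev_g                # successive full gradients
            eta = (s @ s) / (m * (s @ y))       # assumed 1/m scaling
        prev_w, prev_g = w_tilde.copy(), g_tilde.copy()
        w = w_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient
            v = grad_i(w, i) - grad_i(w_tilde, i) + g_tilde
            w -= eta * v
        w_tilde = w                             # new snapshot
    return w_tilde

# Example finite-sum problem: l2-regularized logistic regression
def make_problem(X, y, lam=1e-3):
    n = X.shape[0]
    def grad_i(w, i):
        z = -y[i] * (X[i] @ w)
        return -y[i] * X[i] / (1.0 + np.exp(-z)) + lam * w
    def full_grad(w):
        z = -y * (X @ w)
        return -(X.T @ (y / (1.0 + np.exp(-z)))) / n + lam * w
    return grad_i, full_grad

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 20))
    y = np.sign(X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(500))
    grad_i, full_grad = make_problem(X, y)
    w = svrg_bb(grad_i, full_grad, np.zeros(20), n=500, m=1000, eta0=0.1, epochs=15)
    print("final gradient norm:", np.linalg.norm(full_grad(w)))
```

The gradient norm printed at the end should decrease roughly linearly with the number of epochs on this strongly convex problem, matching the $O(\rho^k)$ behavior the abstract describes.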

Updated: 2019-05-29