Accelerated Proximal Subsampled Newton Method
IEEE Transactions on Neural Networks and Learning Systems (IF 10.4), Pub Date: 2020-09-09, DOI: 10.1109/tnnls.2020.3017555
Haishan Ye , Luo Luo , Zhihua Zhang

Composite function optimization problems arise frequently in machine learning, where they are known as regularized empirical minimization. We introduce an acceleration technique into the Newton-type proximal method and propose a novel algorithm called the accelerated proximal subsampled Newton method (APSSN). APSSN subsamples only a small subset of samples to construct an approximate Hessian, thereby achieving computational efficiency, while still retaining a fast convergence rate. Furthermore, we obtain the scaled proximal mapping by solving its dual problem with the semismooth Newton method rather than resorting to first-order methods. Owing to our sampling strategy and the fast convergence rate of the semismooth Newton method, the scaled proximal mapping can be computed efficiently. Both our theoretical analysis and our empirical study show that APSSN is an effective and computationally efficient algorithm for composite function optimization problems.
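To make the ingredients concrete, the following is a minimal sketch of one subsampled proximal Newton iteration for L1-regularized logistic regression. It is not the authors' APSSN: it omits the acceleration scheme, and it solves the scaled proximal subproblem with a simple inner proximal-gradient loop rather than applying the semismooth Newton method to the dual, as the paper does. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def subsampled_prox_newton_step(X, y, w, lam, sample_size,
                                inner_iters=50, rng=None):
    """One subsampled proximal Newton step for L1-regularized logistic
    regression (illustrative sketch, not the paper's APSSN).

    The Hessian of the smooth part is estimated from a random subset of
    samples; the scaled proximal subproblem
        min_z  grad^T (z - w) + 0.5 (z - w)^T H (z - w) + lam * ||z||_1
    is then solved approximately by inner proximal-gradient iterations.
    """
    rng = rng or np.random.default_rng(0)
    n = X.shape[0]
    p = 1.0 / (1.0 + np.exp(-X @ w))            # sigmoid predictions
    grad = X.T @ (p - y) / n                    # full gradient of smooth part
    # Subsampled Hessian: curvature built from `sample_size` random rows.
    idx = rng.choice(n, size=sample_size, replace=False)
    Xs, ps = X[idx], p[idx]
    D = ps * (1.0 - ps)                         # logistic curvature weights
    H = (Xs * D[:, None]).T @ Xs / sample_size
    H += 1e-6 * np.eye(X.shape[1])              # small damping for stability
    # Inner proximal-gradient loop on the quadratic model.
    L = np.linalg.norm(H, 2)                    # spectral norm = model Lipschitz constant
    z = w.copy()
    for _ in range(inner_iters):
        model_grad = grad + H @ (z - w)
        z = soft_threshold(z - model_grad / L, lam / L)
    return z
```

The subsample size trades Hessian accuracy against per-iteration cost; the paper's contribution is to pair this cheap curvature estimate with acceleration and a semismooth Newton inner solver so that the overall convergence rate is preserved.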
