Distributed kernel gradient descent algorithm for minimum error entropy principle
Applied and Computational Harmonic Analysis (IF 2.5), Pub Date: 2019-01-15, DOI: 10.1016/j.acha.2019.01.002
Ting Hu, Qiang Wu, Ding-Xuan Zhou

Distributed learning based on the divide-and-conquer approach is a powerful tool for big data processing. We introduce a distributed kernel gradient descent algorithm for the minimum error entropy principle and analyze its convergence. We show that the L2 error decays at a minimax optimal rate under some mild conditions. As a tool, we establish concentration inequalities for U-statistics, which play pivotal roles in our error analysis.
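
The abstract does not spell out the algorithm, so below is a minimal sketch of one way to instantiate it, under assumptions that are not from the paper: a Gaussian RBF kernel, a Gaussian window G_h in the empirical information potential V_h(f) = (1/(n(n-1))) * sum_{i != j} G_h((y_i - f(x_i)) - (y_j - f(x_j))) (a U-statistic of order two), functional gradient ascent on V_h in the RKHS with a fixed step size, and a plain average of the m local estimators. All parameter values (T, eta, h, s, m) are illustrative only.

# A minimal sketch (not the paper's code), assuming a Gaussian RBF kernel,
# a Gaussian window for the information potential, and plain averaging of
# the local estimators.  Parameter values are illustrative.
import numpy as np


def rbf_kernel(A, B, s=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 s^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * s ** 2))


def local_mee_kgd(X, y, T=500, eta=0.1, h=0.5, s=0.3):
    """Kernel gradient ascent on the empirical information potential
    V_h(f) = (1/(n(n-1))) * sum_{i != j} exp(-(e_i - e_j)^2 / (2 h^2)),
    with residuals e_i = y_i - f(x_i); maximizing V_h is the MEE criterion.
    Returns alpha with f(x) = sum_i alpha_i K(x, X_i)."""
    n = len(y)
    K = rbf_kernel(X, X, s)
    alpha = np.zeros(n)
    for _ in range(T):
        e = y - K @ alpha                       # residuals
        D = e[:, None] - e[None, :]             # D[i, j] = e_i - e_j
        W = np.exp(-D ** 2 / (2.0 * h ** 2))    # Gaussian window G_h(D)
        np.fill_diagonal(W, 0.0)                # exclude i == j (U-statistic)
        # dV_h/df(x_i) = (2 / (n(n-1) h^2)) * sum_{j != i} (e_i - e_j) G_h(e_i - e_j)
        g = (2.0 / (n * (n - 1) * h ** 2)) * (W * D).sum(axis=1)
        alpha += eta * g                        # ascent step in the RKHS
    return alpha


def distributed_mee_kgd(X, y, m=4, s=0.3, **kw):
    """Divide-and-conquer: run the local solver on m disjoint subsets and
    average the local estimators.  MEE is invariant to adding a constant to f,
    so each local fit is re-centered by its mean residual."""
    parts = np.array_split(np.random.permutation(len(y)), m)
    models = []
    for idx in parts:
        Xi, yi = X[idx], y[idx]
        alpha = local_mee_kgd(Xi, yi, s=s, **kw)
        b = np.mean(yi - rbf_kernel(Xi, Xi, s) @ alpha)   # recover the intercept
        models.append((Xi, alpha, b))

    def predict(Xnew):
        return np.mean([rbf_kernel(Xnew, Xi, s) @ a + b for Xi, a, b in models], axis=0)

    return predict


# Toy usage: recover a sine curve from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(400)
predict = distributed_mee_kgd(X, y, m=4)
print(predict(np.array([[-0.5], [0.0], [0.5]])))

The averaging step is what makes the divide-and-conquer approach work in this setting: each local run sees only n/m samples, but averaging keeps the bias of a single local fit while reducing its variance, which is how a minimax-optimal L2 rate can be retained as long as the number of subsets m does not grow too fast.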