Kernel learning with nonconvex ramp loss
Statistical Analysis and Data Mining (IF 2.1). Pub Date: 2022-06-08, DOI: 10.1002/sam.11588
Xijun Liang, Zhipeng Zhang, Xingke Chen, Ling Jian

We study kernel learning with the ramp loss, a nonconvex but noise-resistant loss function. In this work, we justify the validity of the ramp loss under the classical kernel learning framework. In particular, we show that the generalization bound for the empirical ramp-risk minimizer is similar to that of convex surrogate losses, which implies that kernel learning with this loss function is not only noise-resistant but, more importantly, statistically consistent. To adapt to real-time data streams, we introduce PA-ramp, a heuristic online algorithm based on the passive-aggressive framework, to solve this learning problem. Empirically, the algorithm achieves robust performance on the tested noisy scenarios while requiring fewer support vectors.
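The noise resistance of the ramp loss comes from truncating the hinge loss: a badly misclassified point (e.g., one with a flipped label) contributes at most a bounded amount to the empirical risk instead of growing linearly with the margin violation. The sketch below illustrates this with a common parameterization of the ramp loss, min(1 - s, max(0, 1 - margin)) with truncation parameter s < 1; the exact form used in the paper may differ, and s = -1 here is an illustrative choice, not the paper's setting.

```python
import numpy as np

def hinge(margin):
    # Standard (convex) hinge loss: grows without bound as the
    # margin y * f(x) becomes more negative, so a single outlier
    # can dominate the empirical risk.
    return np.maximum(0.0, 1.0 - margin)

def ramp(margin, s=-1.0):
    # Ramp loss: the hinge loss clipped at 1 - s (for s < 1).
    # Any point, however badly misclassified, contributes at
    # most 1 - s, which is the source of noise resistance.
    return np.minimum(1.0 - s, hinge(margin))

# margins y * f(x); the last entry mimics a label-noise outlier
margins = np.array([2.0, 0.5, -0.5, -10.0])

# The outlier's hinge loss is 11.0, but its ramp loss (s = -1)
# is capped at 2.0; the two losses agree on well-behaved points.
hinge_losses = hinge(margins)
ramp_losses = ramp(margins)
```

For points with margin above s the two losses coincide, so the ramp loss only changes the treatment of gross violations; this truncation is what makes the loss nonconvex.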
