Scalable Kernel Ordinal Regression via Doubly Stochastic Gradients
IEEE Transactions on Neural Networks and Learning Systems (IF 10.2), Pub Date: 2020-08-28, DOI: 10.1109/tnnls.2020.3015937
Bin Gu, Xiang Geng, Xiang Li, Wanli Shi, Guansheng Zheng, Cheng Deng, Heng Huang

Ordinal regression (OR) is one of the most important machine learning tasks. The kernel method is a major technique for achieving nonlinear OR. However, traditional kernel OR solvers are inefficient due to the increased complexity introduced by multiple ordinal thresholds as well as the cost of kernel computation. Doubly stochastic gradient (DSG) is a highly efficient and scalable kernel learning algorithm that combines random feature approximation with stochastic functional optimization. However, the theory and algorithm of DSG only support optimization within a single reproducing kernel Hilbert space (RKHS), which is unsuitable for OR problems, where the multiple ordinal thresholds usually lead to multiple RKHSs. To address this problem, we construct a kernel whose RKHS can contain the decision function with multiple thresholds. Based on this new kernel, we further propose a novel DSG-like algorithm, DSGOR. In each iteration of DSGOR, we update the decision function as well as the function bias, with an appropriately set learning rate for each. Our theoretical analysis shows that DSGOR achieves an O(1/t) convergence rate, which is as good as that of DSG, even though it deals with a much harder problem. Extensive experimental results demonstrate that our algorithm is much more efficient than traditional kernel OR solvers, especially on large-scale problems.
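To make the abstract's iteration concrete, below is a minimal sketch of a DSG-style update for kernel ordinal regression. It is not the paper's exact DSGOR algorithm: it assumes an RBF kernel approximated by random Fourier features and uses the common all-thresholds hinge loss; all names (feature, train, predict, sigma, gamma0, eta0) and the threshold-sorting projection are illustrative choices.

```python
import numpy as np

def feature(x, seed, sigma):
    """One random Fourier feature for the RBF kernel exp(-||x-x'||^2 / (2 sigma^2)).
    Re-drawn from its seed on demand, the standard DSG trick that avoids
    storing the sampled frequencies explicitly."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 1.0 / sigma, size=x.shape[-1])
    b = rng.uniform(0.0, 2.0 * np.pi)
    return np.sqrt(2.0) * np.cos(x @ w + b)

def train(X, y, K, T=2000, sigma=1.0, lam=1e-4, gamma0=1.0, eta0=0.1, seed=0):
    """DSG-style training: f(x) = sum_i alpha_i * phi_i(x), plus K-1 ordinal
    thresholds (the 'function bias' terms), each with its own step size."""
    rng = np.random.default_rng(seed)
    alphas, seeds = [], []          # functional part of the model
    thr = np.zeros(K - 1)           # ordinal thresholds, kept in sorted order
    for t in range(1, T + 1):
        i = rng.integers(len(X))    # sample a training point
        x, r = X[i], y[i]           # rank r in {1, ..., K}
        f = sum(a * feature(x, s, sigma) for a, s in zip(alphas, seeds))
        # subgradient of the all-thresholds hinge loss w.r.t. f(x) and thr
        g_f, g_thr = 0.0, np.zeros(K - 1)
        for k in range(K - 1):
            if k + 1 < r and f - thr[k] < 1.0:     # want f(x) >= thr[k] + 1
                g_f -= 1.0; g_thr[k] += 1.0
            if k + 1 >= r and thr[k] - f < 1.0:    # want f(x) <= thr[k] - 1
                g_f += 1.0; g_thr[k] -= 1.0
        gamma, eta = gamma0 / t, eta0 / t          # separate decaying step sizes
        alphas = [a * (1.0 - gamma * lam) for a in alphas]  # shrink for L2 term
        seeds.append(t)                            # new random feature, seeded by t
        alphas.append(-gamma * g_f * feature(x, t, sigma))
        thr = np.sort(thr - eta * g_thr)           # heuristic: keep thresholds ordered
    return alphas, seeds, thr

def predict(x, alphas, seeds, thr, sigma=1.0):
    f = sum(a * feature(x, s, sigma) for a, s in zip(alphas, seeds))
    return 1 + int(np.sum(f > thr))                # rank = 1 + #thresholds below f(x)
```

The two step sizes gamma and eta mirror the abstract's point that the decision function and the function bias need appropriately set learning rates of their own, and storing only (seed, alpha) pairs keeps memory at one coefficient per iteration.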

Updated: 2020-08-28