Distributed and Quantized Online Multi-Kernel Learning
IEEE Transactions on Signal Processing ( IF 5.4 ) Pub Date : 2021-09-27 , DOI: 10.1109/tsp.2021.3115357
Yanning Shen , Saeed Karimi-Bidhendi , Hamid Jafarkhani

Kernel-based learning has well-documented merits in various machine learning tasks. Most kernel-based learning approaches rely on a pre-selected kernel, the choice of which presumes task-specific prior information. In addition, most existing frameworks assume that data are collected centrally and in a batch. Such a setting may not be feasible, especially for large-scale data sets that are collected sequentially over a network. To cope with these challenges, the present work develops an online multi-kernel learning scheme to infer the intended nonlinear function 'on the fly' from data samples that are collected in distributed locations. To address communication efficiency among distributed nodes, we study the effects of quantization and develop a distributed and quantized online multi-kernel learning algorithm. We provide a regret analysis indicating that our algorithm achieves sublinear regret. Numerical tests on real datasets show the effectiveness of our algorithm.
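The core ideas in the abstract, learning with multiple candidate kernels online and quantizing updates to save communication, can be illustrated with a minimal sketch. This is not the authors' algorithm: it assumes random Fourier features to approximate each Gaussian kernel, a per-kernel online gradient step on a uniformly quantized gradient (standing in for rate-limited communication), and exponential weights to combine the kernels. All function names, the bandwidth grid, and the synthetic target are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rf_features(dim, n_feat, bandwidth, rng):
    """Random Fourier feature map approximating a Gaussian kernel
    of the given bandwidth (Rahimi-Recht style approximation)."""
    W = rng.normal(scale=1.0 / bandwidth, size=(n_feat, dim))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_feat)
    return lambda x: np.sqrt(2.0 / n_feat) * np.cos(W @ x + b)

def quantize(g, step=0.05):
    """Uniform quantizer: stands in for sending gradients
    over a finite-rate link between distributed nodes."""
    return step * np.round(g / step)

# Three candidate kernels (bandwidths chosen arbitrarily for the demo).
bandwidths = [0.5, 1.0, 2.0]
dim, n_feat, eta = 2, 50, 0.1
phis = [make_rf_features(dim, n_feat, bw, rng) for bw in bandwidths]
thetas = [np.zeros(n_feat) for _ in bandwidths]   # per-kernel weights
weights = np.ones(len(bandwidths))                # kernel-combination weights

losses = []
for t in range(500):
    x = rng.normal(size=dim)
    y = np.sin(x[0]) + 0.1 * rng.normal()         # synthetic stream
    feats = [phi(x) for phi in phis]
    preds = [th @ f for th, f in zip(thetas, feats)]
    w = weights / weights.sum()
    y_hat = float(np.dot(w, preds))               # combined prediction
    losses.append((y_hat - y) ** 2)
    for k in range(len(bandwidths)):
        grad = 2.0 * (preds[k] - y) * feats[k]
        thetas[k] -= eta * quantize(grad)         # quantized gradient step
        weights[k] *= np.exp(-eta * (preds[k] - y) ** 2)  # exp. weights

print(np.mean(losses[:100]), np.mean(losses[-100:]))
```

Despite the quantization error injected at every step, the running squared loss of the combined predictor drops well below its initial level, mirroring the abstract's claim that quantized updates still allow effective online learning.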

Updated: 2021-10-19