Optimally Compressed Nonparametric Online Learning: Tradeoffs between memory and consistency
IEEE Signal Processing Magazine (IF 14.9). Pub Date: 2020-05-01. DOI: 10.1109/msp.2020.2973567. Authors: Alec Koppel, Amrit Singh Bedi, Ketan Rajawat, Brian M. Sadler
Batch training of machine learning models based on neural networks is well established, whereas, to date, streaming methods are largely based on linear models. To go beyond linear models in the online setting, nonparametric methods are of interest due to their universality and their ability to stably incorporate new information via convexity or Bayes's rule. Unfortunately, when applied online, nonparametric methods suffer a "curse of dimensionality" that precludes their use: their complexity scales at least with the time index. We survey online compression tools that bring their memory under control and attain approximate convergence. The asymptotic bias depends on a compression parameter that trades off memory and accuracy. Applications to robotics, communications, economics, and power systems are discussed, as well as extensions to multiagent systems.
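To make the memory/accuracy tradeoff concrete, the sketch below shows a generic online kernel regressor whose expansion would otherwise grow by one atom per sample (the complexity growth the abstract describes), paired with a crude coefficient-threshold compression step. This is an illustrative stand-in, not the authors' algorithm (which uses a more principled destructive matching-pursuit compression); the class name, the threshold rule, and all parameter values are assumptions for the example.

```python
import numpy as np

def gaussian_kernel(x, y, bw=0.5):
    """RBF kernel between two points."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * bw ** 2))

class CompressedOnlineKernelRegressor:
    """Functional SGD in an RKHS for regularized squared loss.

    Without compression, each update appends one kernel atom, so memory
    grows linearly with the time index. Here, atoms whose coefficients
    have decayed below `eps` are dropped — a simplistic proxy for the
    compression rules surveyed in the article; `eps` plays the role of
    the compression parameter trading memory against accuracy.
    """
    def __init__(self, step=0.5, reg=0.1, eps=0.01, bw=0.5):
        self.step, self.reg, self.eps, self.bw = step, reg, eps, bw
        self.dict_pts = []   # retained sample points (the "dictionary")
        self.weights = []    # kernel-expansion coefficients

    def predict(self, x):
        return sum(w * gaussian_kernel(x, d, self.bw)
                   for w, d in zip(self.weights, self.dict_pts))

    def update(self, x, y):
        err = self.predict(x) - y
        # SGD step for (1/2)(f(x)-y)^2 + (reg/2)||f||^2:
        # shrink old coefficients, append a new atom at x.
        self.weights = [(1 - self.step * self.reg) * w for w in self.weights]
        self.weights.append(-self.step * err)
        self.dict_pts.append(x)
        # Compression: discard atoms with negligible coefficients.
        keep = [i for i, w in enumerate(self.weights) if abs(w) > self.eps]
        self.weights = [self.weights[i] for i in keep]
        self.dict_pts = [self.dict_pts[i] for i in keep]

# Stream 300 noisy samples of sin(2x); the dictionary stays far smaller.
rng = np.random.default_rng(0)
model = CompressedOnlineKernelRegressor()
for _ in range(300):
    x = rng.uniform(-2, 2, size=1)
    y = np.sin(2 * x[0]) + 0.1 * rng.normal()
    model.update(x, y)
print(len(model.dict_pts))  # bounded dictionary, not 300 atoms
```

A smaller `eps` retains more atoms (lower bias, more memory); a larger `eps` prunes aggressively (bounded memory, larger asymptotic bias) — the tradeoff the abstract refers to.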
Updated: 2020-05-01