Online Bayesian shrinkage regression
Neural Computing and Applications (IF 6) Pub Date: 2020-05-27, DOI: 10.1007/s00521-020-04947-y
Waqas Jamil , Abdelhamid Bouchachia

This work introduces a new online regression method that extends shrinkage via the limit of the Gibbs sampler (SLOG) to the online learning setting. In particular, we show theoretically how the proposed online SLOG (OSLOG) is derived within the Bayesian framework without resorting to the Gibbs sampler or to a hierarchical representation. Moreover, to establish a performance guarantee for OSLOG, we derive an upper bound on its cumulative squared loss; OSLOG is the only sparse online regression algorithm with a logarithmic regret bound. Furthermore, we compare OSLOG empirically against two state-of-the-art algorithms along three aspects, namely normality, sparsity and multicollinearity, and show that it achieves an excellent trade-off between these properties.
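To make the setting concrete, below is a minimal, hypothetical sketch of how an online SLOG-style learner could be organized: it maintains the running sufficient statistics A = Σ x_t x_tᵀ and b = Σ y_t x_t, and at each round applies a few reweighted-ridge fixed-point iterations of the form w ← D (A D + λI)⁻¹ b with D = diag(|w|), which is the iteration commonly used to describe batch SLOG. The class name, hyperparameters, and update schedule are illustrative assumptions, not the paper's actual OSLOG recursion.

```python
import numpy as np


class OnlineSLOGSketch:
    """Hypothetical online SLOG-style shrinkage regression (illustrative only).

    Keeps the running sufficient statistics A = sum_t x_t x_t^T and
    b = sum_t y_t x_t, then applies a few SLOG-style reweighted-ridge
    iterations  w <- D (A D + lam I)^{-1} b  with D = diag(|w|).
    This mimics the fixed-point form of batch SLOG; the paper's exact
    OSLOG update may differ.
    """

    def __init__(self, dim, lam=1.0, inner_iters=5):
        self.A = np.zeros((dim, dim))    # running Gram matrix X^T X
        self.b = np.zeros(dim)           # running X^T y
        self.w = np.full(dim, 1e-3)      # small nonzero start so diag(|w|) != 0
        self.lam = lam                   # shrinkage strength (assumed hyperparameter)
        self.inner_iters = inner_iters   # reweighting steps per round (assumed)

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        self.A += np.outer(x, x)
        self.b += y * x
        d = len(self.b)
        for _ in range(self.inner_iters):
            # Coordinates driven to zero stay zero: this is the sparsity mechanism.
            D = np.diag(np.abs(self.w))
            self.w = D @ np.linalg.solve(self.A @ D + self.lam * np.eye(d), self.b)


# Toy run on a sparse linear target: predict first, then update,
# accumulating the squared loss that the regret bound refers to.
rng = np.random.default_rng(0)
w_true = np.array([2.0, 0.0, -1.0, 0.0, 0.0])
model = OnlineSLOGSketch(dim=5, lam=0.5)
cum_loss = 0.0
for t in range(200):
    x = rng.normal(size=5)
    y = float(w_true @ x) + 0.1 * rng.normal()
    cum_loss += (y - model.predict(x)) ** 2
    model.update(x, y)
```

Under this reading, the cumulative squared loss tracked above is the quantity the paper bounds, and the diag(|w|) reweighting is what yields sparse weight vectors online.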


