Horseshoe Regularisation for Machine Learning in Complex and Deep Models
International Statistical Review ( IF 1.7 ) Pub Date : 2020-01-29 , DOI: 10.1111/insr.12360
Anindya Bhadra 1 , Jyotishka Datta 2 , Yunfan Li 1 , Nicholas Polson 3

Since the advent of the horseshoe prior for regularization, global-local shrinkage methods have proved to be a fertile ground for the development of Bayesian methodology in machine learning, specifically for high-dimensional regression and classification problems. They have achieved remarkable success in computation and enjoy strong theoretical support. Most of the existing literature has focused on the linear Gaussian case; see Bhadra et al. (2019b) for a systematic survey. The purpose of the current article is to demonstrate that horseshoe regularization is useful far more broadly, by reviewing both methodological and computational developments in complex models that are more relevant to machine learning applications. Specifically, we focus on methodological challenges of horseshoe regularization in nonlinear and non-Gaussian models, multivariate models, and deep neural networks. We also outline recent computational developments in horseshoe shrinkage for complex models, along with a list of available software implementations that allow one to venture beyond the comfort zone of canonical linear regression problems.
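For readers unfamiliar with the global-local construction the abstract refers to, the horseshoe prior places a half-Cauchy local scale on each coefficient together with a global scale: beta_j | lambda_j, tau ~ N(0, lambda_j^2 tau^2) with lambda_j ~ C+(0, 1). The sketch below (illustrative only, not code from the paper; the function name and fixed global scale tau are assumptions for the example) draws coefficient vectors from this prior with NumPy:

```python
import numpy as np

def sample_horseshoe(p, tau=1.0, rng=None):
    """Draw one p-dimensional coefficient vector from the horseshoe prior.

    Hierarchy: beta_j | lambda_j, tau ~ N(0, lambda_j^2 * tau^2),
               lambda_j ~ C+(0, 1)  (half-Cauchy local shrinkage scales).
    The global scale tau is held fixed here for simplicity; in a full
    Bayesian treatment it would itself get a half-Cauchy prior.
    """
    rng = np.random.default_rng(rng)
    # |standard Cauchy| is a half-Cauchy C+(0, 1) draw
    lam = np.abs(rng.standard_cauchy(p))
    return rng.normal(0.0, lam * tau, size=p)

beta = sample_horseshoe(1000, tau=0.1, rng=0)
# Heavy tails plus a spike at zero: most draws are tiny, a few are large
print(np.median(np.abs(beta)), np.max(np.abs(beta)))
```

The half-Cauchy local scales are what give the prior its signature shape: infinitely tall spike at zero (aggressive shrinkage of noise) combined with polynomial tails (large signals left nearly unshrunk).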
