A Brief Survey of Modern Optimization for Statisticians
International Statistical Review (IF 1.7), Pub Date: 2014-02-17, DOI: 10.1111/insr.12022
Kenneth Lange, Eric C. Chi, Hua Zhou

Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include nondifferentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience.
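To make the role of a nondifferentiable penalty such as the lasso concrete, here is a minimal sketch, assuming the standard lasso objective 0.5*||y - X b||^2 + lam*||b||_1 and using proximal gradient descent (ISTA). It is an illustrative Python snippet, not an algorithm taken from the survey, and the names soft_threshold and lasso_proximal_gradient are ours.

import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1, applied elementwise.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_proximal_gradient(X, y, lam, n_iter=500):
    # Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA).
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, with L the Lipschitz constant of the smooth part
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)             # gradient of the smooth least-squares term
        b = soft_threshold(b - step * grad, step * lam)
    return b

The point of the sketch is that the nondifferentiable l1 term is never differentiated; it enters only through its closed-form proximal map, the soft-thresholding operator, which is what produces exact zeros in the estimated coefficients.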
