A Variational View on Statistical Multiscale Estimation
Annual Review of Statistics and Its Application (IF 7.4). Pub Date: 2022-03-07. DOI: 10.1146/annurev-statistics-040120-030531
Markus Haltmeier 1, Housen Li 2,3, Axel Munk 2,3
We present a unifying view on various statistical estimation techniques, including penalization, variational, and thresholding methods. These estimators are analyzed in the context of statistical linear inverse problems, with nonparametric regression, change point regression, and high-dimensional linear models as examples. Our approach reveals many seemingly unrelated estimation schemes as special instances of a general class of variational multiscale estimators, called MIND (multiscale Nemirovskii–Dantzig). These estimators result from minimizing certain regularization functionals under convex constraints that can be seen as multiple statistical tests for local hypotheses. For computational purposes, we recast MIND in terms of simpler unconstrained optimization problems via Lagrangian penalization as well as Fenchel duality. The performance of several MINDs is demonstrated in numerical examples.
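As a concrete illustration of the constrained formulation described in the abstract, the sketch below fits a MIND-type estimator for a simple one-dimensional regression problem: it minimizes the total variation of the fit subject to multiscale constraints on scaled residual sums over a dyadic system of intervals, each constraint acting like a local statistical test at one scale and location. This is a minimal sketch, not code from the paper; the interval system, noise level, and threshold q are illustrative assumptions.

```python
# Minimal MIND-type sketch (illustrative, not the paper's implementation):
# minimize a total-variation regularizer subject to multiscale constraints
# on local residual averages, formulated as a convex program with cvxpy.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 128
truth = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # piecewise-constant signal
y = truth + 0.3 * rng.standard_normal(n)             # noisy observations

u = cp.Variable(n)

# Dyadic system of intervals: each constraint bounds the scaled residual sum
# on one interval (a local test); q is an illustrative threshold ~ noise level
# times a universal constant.
q = 3.0 * 0.3
constraints = []
length = 1
while length <= n:
    for start in range(0, n - length + 1, length):
        idx = slice(start, start + length)
        constraints.append(
            cp.abs(cp.sum(y[idx] - u[idx])) / np.sqrt(length) <= q
        )
    length *= 2

# MIND: smallest total variation among all signals that pass every local test.
objective = cp.Minimize(cp.norm1(cp.diff(u)))
problem = cp.Problem(objective, constraints)
problem.solve()

estimate = u.value
print("estimated jump location:", int(np.argmax(np.abs(np.diff(estimate)))) + 1)
```

Replacing the hard constraints by a penalty term added to the objective (weighted by a Lagrange multiplier) would give an unconstrained form in the spirit of the Lagrangian recast mentioned in the abstract.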
