The generalized equivalence of regularization and min–max robustification in linear mixed models
Statistical Papers (IF 1.2) Pub Date: 2021-01-11, DOI: 10.1007/s00362-020-01214-z
Jan Pablo Burgard, Joscha Krause, Dennis Kreber, Domingo Morales

We address the connection between regularization and min–max robustification in linear mixed models in the presence of unobservable covariate measurement errors. We prove that regularized model parameter estimation is equivalent to robust loss minimization under a min–max approach. Using the LASSO, Ridge regression, and the Elastic Net as examples, we derive uncertainty sets that characterize the feasible noise that can be added to a given estimation problem. These sets allow us to determine measurement error bounds without distributional assumptions. We propose a conservative Jackknife estimator of the mean squared error in this setting, and we further derive conditions under which min–max robust estimation of model parameters is consistent. The theoretical findings are supported by a Monte Carlo simulation study under multiple measurement error scenarios.
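The equivalence described in the abstract can be previewed through its classical linear-regression analogue (El Ghaoui and Lebret, 1997): for a fixed coefficient vector, the worst-case residual norm over all design-matrix perturbations of bounded Frobenius norm equals the nominal residual norm plus a regularization-style penalty on the coefficients. The sketch below checks this identity numerically; it is a generic illustration under that simplified setting, not the paper's generalized linear mixed model construction, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, rho = 50, 5, 0.3  # rho = radius of the uncertainty set for the noise
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Fix a candidate estimate b (here: ordinary least squares) and its residual.
b = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ b

# Worst-case perturbation Delta with ||Delta||_F = rho: a rank-one matrix
# aligned with the residual direction and the coefficient direction.
Delta = -rho * np.outer(r / np.linalg.norm(r), b / np.linalg.norm(b))

worst = np.linalg.norm(y - (X + Delta) @ b)          # max over the uncertainty set
bound = np.linalg.norm(r) + rho * np.linalg.norm(b)  # nominal loss + penalty term
print(worst, bound)  # the two coincide: the penalty is the worst-case excess loss

# No feasible perturbation in the uncertainty set exceeds the bound.
for _ in range(1000):
    D = rng.normal(size=(n, p))
    D *= rho / np.linalg.norm(D)  # rescale to Frobenius norm rho
    assert np.linalg.norm(y - (X + D) @ b) <= bound + 1e-9
```

Minimizing the right-hand side over `b` is a square-root-Ridge-type regularized problem, which is the sense in which regularization and min–max robustification coincide; the paper extends this kind of equivalence to linear mixed models and to LASSO- and Elastic-Net-shaped uncertainty sets.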
