Large-Scale Regression with Non-Convex Loss and Penalty
Applied Numerical Mathematics ( IF 2.2 ) Pub Date : 2020-11-01 , DOI: 10.1016/j.apnum.2020.07.006
Alessandro Buccini , Omar De la Cruz Cabrera , Marco Donatelli , Andrea Martinelli , Lothar Reichel

Abstract: We describe a computational method for parameter estimation in linear regression that is capable of simultaneously producing sparse estimates and dealing with outliers and heavy-tailed error distributions. The method is based on the image restoration method proposed in [G. Huang, A. Lanza, S. Morigi, L. Reichel, and F. Sgallari, Majorization-minimization generalized Krylov subspace methods for ℓp-ℓq optimization applied to image restoration, BIT Numer. Math., 57 (2017), pp. 351–378]. It can be applied to problems of arbitrary size. The choice of certain parameters is discussed. Results obtained for simulated and real data are presented.
