Robust Least Squares for Quantized Data Matrices
arXiv - CS - Numerical Analysis Pub Date : 2020-03-26 , DOI: arxiv-2003.12004
Richard Clancy and Stephen Becker

In this paper we formulate and solve a robust least squares problem for a system of linear equations subject to quantization error in the data matrix. Ordinary least squares fails to consider uncertainty in the operator, modeling all noise in the observed signal. Total least squares accounts for uncertainty in the data matrix, but necessarily increases the condition number of the operator compared to ordinary least squares. Tikhonov regularization or ridge regression is frequently employed to combat ill-conditioning, but requires parameter tuning which presents a host of challenges and places strong assumptions on parameter prior distributions. The proposed method also requires selection of a parameter, but it can be chosen in a natural way, e.g., a matrix rounded to the 4th digit uses an uncertainty bounding parameter of 0.5e-4. We show here that our robust method is theoretically appropriate, tractable, and performs favorably against ordinary and total least squares.
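The setting in the abstract can be sketched numerically: quantize a data matrix to 4 decimal digits, so each entry is perturbed by at most 0.5e-4, then compare ordinary least squares on the quantized matrix against a robust solve that accounts for the bounded perturbation. The sketch below uses the classic worst-case-residual formulation (minimize ‖A_q x − b‖₂ + η‖x‖₂ for a spectral-norm bound η on the perturbation), solved by a simple fixed-point iteration on the stationarity condition; this is an illustrative stand-in under stated assumptions, not the paper's exact method, and the dimensions, seed, and iteration count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 10
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Quantize A to the 4th digit: each entry moves by at most delta = 0.5e-4,
# which gives the "natural" uncertainty bounding parameter from the abstract.
A_q = np.round(A, 4)
delta = 0.5e-4

# Ordinary least squares, treating the quantized matrix as exact.
x_ols, *_ = np.linalg.lstsq(A_q, b, rcond=None)

# Elementwise |dA_ij| <= delta implies the conservative spectral-norm bound
# ||dA||_2 <= delta * sqrt(m * n).
eta = delta * np.sqrt(m * n)

# Robust LS for ||dA||_2 <= eta minimizes ||A_q x - b||_2 + eta * ||x||_2.
# Its stationarity condition is (A_q^T A_q + mu I) x = A_q^T b with
# mu = eta * ||A_q x - b|| / ||x||, which suggests a fixed-point iteration:
x_rob = x_ols.copy()
for _ in range(50):
    mu = eta * np.linalg.norm(A_q @ x_rob - b) / max(np.linalg.norm(x_rob), 1e-12)
    x_rob = np.linalg.solve(A_q.T @ A_q + mu * np.eye(n), A_q.T @ b)

print("OLS error:   ", np.linalg.norm(x_ols - x_true))
print("robust error:", np.linalg.norm(x_rob - x_true))
```

Because the quantization level (and hence η) is tiny here, the robust solution is a mild Tikhonov-like shrinkage of the OLS solution; the point is that the regularization weight falls out of the quantization level rather than being tuned.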

Updated: 2020-06-25