On the Improved Rates of Convergence for Matérn-Type Kernel Ridge Regression with Application to Calibration of Computer Models
SIAM/ASA Journal on Uncertainty Quantification (IF 2) Pub Date: 2020-12-08, DOI: 10.1137/19m1304222
Rui Tuo , Yan Wang , C. F. Jeff Wu

SIAM/ASA Journal on Uncertainty Quantification, Volume 8, Issue 4, Page 1522-1547, January 2020.
Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator under both the $L_2$ norm and the norm of the reproducing kernel Hilbert space exceed the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy--O'Hagan approach [J. R. Stat. Soc. Ser. B. Stat. Methodol., 63 (2001), pp. 425--464] for calibrating model parameters of computer simulation. We prove that, under certain conditions, the Kennedy--O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space.
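To make the object of study concrete, the kernel ridge regression estimator described above can be sketched in a few lines: given noisy observations of a smooth function, it solves a regularized linear system in a Matérn-type kernel matrix. This is a minimal illustration, not the paper's method; the Matérn smoothness (ν = 3/2), length-scale, regularization level, and test function are all illustrative choices.

```python
import numpy as np

def matern32(x, y, ell=1.0):
    """Matérn kernel with smoothness nu = 3/2 and length-scale ell (1-D inputs)."""
    r = np.abs(x[:, None] - y[None, :])
    a = np.sqrt(3.0) * r / ell
    return (1.0 + a) * np.exp(-a)

def krr_fit(X, y, lam=1e-3, ell=1.0):
    """Kernel ridge regression: solve (K + n*lam*I) alpha = y.

    The estimator is f_hat(x) = k(x, X) @ alpha, which is the minimizer of
    (1/n) * sum (y_i - f(X_i))^2 + lam * ||f||_H^2 over the RKHS H of the kernel.
    """
    n = len(X)
    K = matern32(X, X, ell)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xnew: matern32(Xnew, X, ell) @ alpha

# Noisy observations of a smooth target function
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=50)

f_hat = krr_fit(X, y, lam=1e-3, ell=0.2)
Xg = np.linspace(0.0, 1.0, 200)
err = np.max(np.abs(f_hat(Xg) - np.sin(2 * np.pi * Xg)))
```

The rate results in the paper concern how fast errors such as `err` shrink as the sample size grows (in the $L_2$ and RKHS norms) when the regularization parameter is chosen appropriately for the smoothness of the kernel and of the truth.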

