Learning Elliptic Partial Differential Equations with Randomized Linear Algebra
Foundations of Computational Mathematics (IF 2.5). Pub Date: 2022-01-18, DOI: 10.1007/s10208-022-09556-w
Nicolas Boullé, Alex Townsend

Given input–output pairs of an elliptic partial differential equation (PDE) in three dimensions, we derive the first theoretically rigorous scheme for learning the associated Green’s function G. By exploiting the hierarchical low-rank structure of G, we show that one can construct an approximant to G that converges almost surely and achieves a relative error of \(\mathcal {O}(\varGamma _\epsilon ^{-1/2}\log ^3(1/\epsilon )\epsilon )\) using at most \(\mathcal {O}(\epsilon ^{-6}\log ^4(1/\epsilon ))\) input–output training pairs with high probability, for any \(0<\epsilon <1\). The quantity \(0<\varGamma _\epsilon \le 1\) characterizes the quality of the training dataset. Along the way, we extend the randomized singular value decomposition algorithm for learning matrices to Hilbert–Schmidt operators and characterize the quality of covariance kernels for PDE learning.
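The randomized singular value decomposition that the abstract refers to extending is, in its classical finite-dimensional form, a way of recovering a low-rank matrix from random matrix–vector products alone. The sketch below is an illustrative implementation of that standard matrix algorithm (not the paper's operator-valued extension); all function and variable names are our own.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10):
    """Approximate the top `rank` singular triples of A from random samples."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    # Probe the column space of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega                       # random samples of range(A)
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for the sampled range
    B = Q.T @ A                         # small projected matrix
    U_b, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_b
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Example: recover an exactly rank-5 matrix from its action on random vectors.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
U, s, Vt = randomized_svd(A, rank=5)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)  # near machine precision, since A has exact rank 5
```

In the PDE-learning setting, the role of the Gaussian test matrix is played by random forcing terms drawn from a Gaussian process, and the role of the matrix–vector products is played by the PDE's input–output (solution) pairs; the quantity \(\varGamma _\epsilon\) in the abstract measures how well the chosen covariance kernel probes the operator.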




Updated: 2022-01-19