Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions.
Nature Communications (IF 14.7) Pub Date: 2020-07-03, DOI: 10.1038/s41467-020-17142-3
Janni Yuval, Paul A. O'Gorman

Global climate models represent small-scale processes such as convection using subgrid models known as parameterizations, and these parameterizations contribute substantially to uncertainty in climate projections. Machine learning of new parameterizations from high-resolution model output is a promising approach, but such parameterizations have been prone to issues of instability and climate drift, and their performance for different grid spacings has not yet been investigated. Here we use a random forest to learn a parameterization from coarse-grained output of a three-dimensional high-resolution idealized atmospheric model. The parameterization leads to stable simulations at coarse resolution that replicate the climate of the high-resolution simulation. Retraining for different coarse-graining factors shows the parameterization performs best at smaller horizontal grid spacings. Our results yield insights into parameterization performance across length scales, and they also demonstrate the potential for learning parameterizations from global high-resolution simulations that are now emerging.
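The core workflow described here — training a random forest on coarse-grained high-resolution output to predict subgrid tendencies — can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the input features (stacked temperature and humidity profiles), the synthetic target, and all array shapes are assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: each sample is a coarse-grained atmospheric
# column, here stacked profiles of temperature and humidity on 10 levels.
n_samples, n_levels = 2000, 10
X = rng.normal(size=(n_samples, 2 * n_levels))

# Synthetic stand-in for the subgrid tendency that the parameterization
# must predict: a smooth nonlinear function of the inputs plus noise.
y = 0.1 * np.tanh(X[:, :n_levels]).sum(axis=1) \
    + rng.normal(scale=0.01, size=n_samples)

# Fit the random-forest parameterization on the coarse-grained data.
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X, y)

# At run time, the coarse model would query the forest each time step
# with the current column state to obtain the subgrid tendency.
pred = model.predict(X[:5])
```

Retraining for a different coarse-graining factor amounts to regenerating `X` and `y` from output coarse-grained to the new grid spacing and refitting the same estimator.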


