A probabilistic generative model for semi-supervised training of coarse-grained surrogates and enforcing physical constraints through virtual observables
Journal of Computational Physics (IF 4.1), Pub Date: 2021-02-22, DOI: 10.1016/j.jcp.2021.110218
Maximilian Rixner, Phaedon-Stelios Koutsourelakis

The data-centric construction of inexpensive surrogates for fine-grained physical models has been at the forefront of computational physics due to its significant utility in many-query tasks such as uncertainty quantification. Recent efforts have taken advantage of enabling technologies from the field of machine learning (e.g., deep neural networks) in combination with simulation data. While such strategies have shown promise even in higher-dimensional problems, they generally require large amounts of training data, even though the construction of surrogates is by definition a small-data problem. Rather than employing data-based loss functions, it has been proposed to make use of the governing equations (in the simplest case, at collocation points) in order to imbue the training of the otherwise black-box-like interpolators with domain knowledge. The present paper provides a flexible, probabilistic framework that accounts for physical structure and information both in the training objectives and in the surrogate model itself. We advocate a probabilistic (Bayesian) model in which equalities that are available from the physics (e.g., residuals, conservation laws) can be introduced as virtual observables and can provide additional information through the likelihood. We further advocate a generative model, i.e. one that attempts to learn the joint density of inputs and outputs, which is capable of making use of unlabeled data (i.e., only inputs) in a semi-supervised fashion in order to reveal lower-dimensional embeddings of the high-dimensional input that are nevertheless predictive of the fine-grained model's output.
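
To make the virtual-observables idea concrete, the following minimal sketch (an illustration under our own assumptions, not the authors' implementation; the function names, the toy linear constraint and the noise scale sigma are hypothetical) treats the residual of the governing equations as a pseudo-observation of zero under a Gaussian likelihood, so that physical constraints enter the log-likelihood in the same way measured data would:

    import numpy as np

    def virtual_observable_loglik(x, y, residual_fn, sigma=1e-2):
        """Log-likelihood of 'observing' the residual r(x, y) at zero, with r ~ N(0, sigma^2 I).

        Smaller residuals (i.e. better satisfaction of the governing equations)
        make the pair (x, y) more probable, so the physics acts like extra data.
        """
        r = np.atleast_1d(residual_fn(x, y))
        n = r.size
        return -0.5 * np.sum((r / sigma) ** 2) - n * np.log(sigma) - 0.5 * n * np.log(2.0 * np.pi)

    # Toy usage with a hypothetical linear 'governing equation' A y - x = 0.
    A = np.array([[2.0, 0.0], [0.0, 3.0]])
    residual = lambda x, y: A @ y - x
    x = np.array([1.0, 3.0])
    y_exact = np.array([0.5, 1.0])   # satisfies the constraint exactly
    y_off = np.array([0.6, 1.1])     # violates it slightly
    print(virtual_observable_loglik(x, y_exact, residual))  # higher log-likelihood
    print(virtual_observable_loglik(x, y_off, residual))    # lower log-likelihood

In the semi-supervised setting described above, such terms would be combined with the likelihood of the labeled simulation pairs and the density of the unlabeled inputs; taking sigma toward zero enforces the physical equality ever more strictly.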

Updated: 2021-02-26