Adaboost-based ensemble of polynomial chaos expansion with adaptive sampling
Computer Methods in Applied Mechanics and Engineering (IF 7.2), Pub Date: 2021-11-01, DOI: 10.1016/j.cma.2021.114238
Yicheng Zhou, Zhenzhou Lu, Kai Cheng

In this study, we propose a new polynomial chaos expansion surrogate modeling method based on Adaboost (Adaboost-PCE) for uncertainty quantification. Adaboost is an ensemble learning technique originating from the machine learning field. The idea of Adaboost-PCE is to repeatedly construct a "weak" PCE model, with a weight assigned to each sample point. At each iteration, the weight of a particular sample point in the training set depends on the performance of the current surrogate models on that sample. Weighted least-squares approximation then exploits these sample weights so as to reduce the effect of outliers. Adaboost-PCE is appealing because the ensemble weights can be estimated without the explicit error metrics required by most existing ensemble methods. Moreover, it provides an expectation of the prediction error, which enables efficient adaptive sampling. The proposed method is validated by comparing its performance on a series of numerical tests, including partial differential equations with high-dimensional inputs.
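To make the boosting idea concrete, the following is a minimal, illustrative Python sketch of an AdaBoost.R2-style ensemble in which each "weak" learner is a low-order PCE fitted by weighted least squares. The one-dimensional Hermite basis, the linear loss, the function names, and the weighted-average ensemble prediction are assumptions made here for illustration only; they are not taken from the paper, and the paper's adaptive-sampling step is not shown.

```python
"""Illustrative sketch: AdaBoost.R2-style ensemble of weak PCE surrogates.
Not the paper's exact Adaboost-PCE algorithm; weight updates, ensemble
weighting, and the adaptive-sampling criterion may differ."""
import numpy as np
from numpy.polynomial.hermite_e import hermevander


def pce_design_matrix(x, degree):
    """Probabilists' Hermite basis up to `degree` for a 1D standard-normal
    input (a multivariate problem would need a total-degree basis)."""
    return hermevander(x, degree)            # shape (n_samples, degree + 1)


def fit_weak_pce(x, y, w, degree):
    """Weighted least-squares fit of one 'weak' PCE."""
    Psi = pce_design_matrix(x, degree)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * Psi, sw * y, rcond=None)
    return coef


def adaboost_pce(x, y, n_estimators=10, degree=2):
    """AdaBoost.R2-style loop: poorly predicted samples get larger weights
    in the next weighted least-squares fit."""
    n = len(x)
    w = np.full(n, 1.0 / n)                  # sample weights
    coefs, betas = [], []
    for _ in range(n_estimators):
        coef = fit_weak_pce(x, y, w, degree)
        pred = pce_design_matrix(x, degree) @ coef
        err = np.abs(pred - y)
        err /= err.max() + 1e-12              # linear loss in [0, 1]
        eps = np.sum(w * err)                 # weighted average loss
        if eps >= 0.5:                        # weak learner too poor: stop
            break
        beta = eps / (1.0 - eps)
        w *= beta ** (1.0 - err)              # down-weight well-fitted points
        w /= w.sum()
        coefs.append(coef)
        betas.append(beta)
    if not coefs:
        raise RuntimeError("no weak PCE accepted; try a higher degree")
    # ensemble weights derived from beta, with no separate validation metric
    alpha = np.log(1.0 / np.array(betas))
    alpha /= alpha.sum()
    return coefs, alpha, degree


def predict(x, coefs, alpha, degree):
    preds = np.stack([pce_design_matrix(x, degree) @ c for c in coefs])
    return alpha @ preds                      # weighted-average prediction


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(200)
    y = np.sin(x) + 0.1 * rng.standard_normal(200)   # toy model output
    coefs, alpha, deg = adaboost_pce(x, y, n_estimators=20, degree=3)
    print("ensemble size:", len(coefs))
    print("mean abs error:", np.mean(np.abs(predict(x, coefs, alpha, deg) - y)))
```

The weighted-average combination above is a simplification (classic AdaBoost.R2 uses a weighted median), and the paper should be consulted for its actual ensemble-weighting and adaptive-sampling rules.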

Updated: 2021-11-02