Bi-fidelity stochastic gradient descent for structural optimization under uncertainty
Computational Mechanics ( IF 3.7 ) Pub Date : 2020-08-03 , DOI: 10.1007/s00466-020-01870-w
Subhayan De , Kurt Maute , Alireza Doostan

The presence of uncertainty in the material properties and geometry of a structure is ubiquitous. The design of robust engineering structures therefore needs to incorporate uncertainty into the optimization process. The stochastic gradient descent (SGD) method can alleviate the cost of optimization under uncertainty, in which statistical moments of quantities of interest appear in the objective and constraints. However, the design may change considerably during the initial iterations of the optimization, which impedes the convergence of the traditional SGD method and its variants. In this paper, we present two SGD-based algorithms in which the computational cost is reduced by employing a low-fidelity model in the optimization process. In the first algorithm, most of the stochastic gradient evaluations are performed on the low-fidelity model, and only a handful of gradients from the high-fidelity model are used per iteration, resulting in improved convergence. In the second algorithm, gradients from the low-fidelity model are used as a control variate, a variance reduction technique, to reduce the variance of the search direction. These two bi-fidelity algorithms are first illustrated with a conceptual example. The convergence of the proposed bi-fidelity algorithms is then studied on two numerical examples of shape and topology optimization and compared to popular variants of the SGD method that do not use low-fidelity models. The results show that the proposed bi-fidelity approach can improve the convergence of the SGD method. Two analytical proofs are also provided that show the linear convergence of these two algorithms under appropriate assumptions.
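The control-variate idea in the second algorithm can be sketched in a few lines. The snippet below is a minimal illustration under strong assumptions, not the paper's structural optimization problem: the objective is a toy quadratic in one design variable, `grad_hf` and `grad_lf` stand in for high- and low-fidelity stochastic gradients, and the sample sizes and step size are arbitrary. The key mechanism is the same: a few high-fidelity gradient samples are corrected by the difference between a large-sample and a small-sample low-fidelity mean, which leaves the search direction unbiased while reducing its variance whenever the two fidelities are well correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic objective: minimize E_xi[ f(theta, xi) ], where xi models
# the uncertain parameter (e.g., a material property). Both gradient
# estimators below are illustrative stand-ins, not the paper's models.

def grad_hf(theta, xi):
    # "high-fidelity" stochastic gradient of a quadratic objective
    return (1.0 + 0.1 * xi) * theta - 1.0

def grad_lf(theta, xi):
    # cheap surrogate: strongly correlated with grad_hf, but biased
    return (1.0 + 0.1 * xi) * theta - 0.9

def bifidelity_cv_step(theta, lr, n_hf=2, n_lf=64):
    """One SGD step with a low-fidelity control variate.

    Direction: mean of a few HF gradients, corrected by the difference
    between a large-sample and a small-sample LF mean. The LF terms
    cancel in expectation, so the direction stays unbiased.
    """
    xi_small = rng.standard_normal(n_hf)
    xi_large = rng.standard_normal(n_lf)
    g_hf = grad_hf(theta, xi_small).mean()
    g_lf_small = grad_lf(theta, xi_small).mean()  # same samples as HF
    g_lf_large = grad_lf(theta, xi_large).mean()  # many cheap samples
    d = g_hf + (g_lf_large - g_lf_small)          # control-variate direction
    return theta - lr * d

theta = 5.0
for _ in range(200):
    theta = bifidelity_cv_step(theta, lr=0.1)
# theta approaches the minimizer of E[f(theta, xi)], which is 1.0 here
```

Because the low-fidelity gradient is evaluated on the same samples as the high-fidelity one, the highly correlated noise largely cancels in `g_hf - g_lf_small`, and the remaining variance comes from the cheap large-sample term alone; this is the generic control-variate mechanism the abstract refers to.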
