Recurrent neural network plasticity models: Unveiling their common core through multi-task learning
Computer Methods in Applied Mechanics and Engineering (IF 7.2) Pub Date: 2024-04-24, DOI: 10.1016/j.cma.2024.116991
Julian N. Heidenreich, Dirk Mohr

Recurrent neural network models are known to be particularly well suited to data-driven constitutive modeling because of their built-in memory variables. The main challenge preventing their widespread application to engineering materials is their excessive need for training data. Here, we postulate that RNN models of elasto-plastic solids feature a large common core that is shared by all materials of the same class. The common core is complemented by material-specific layers whose parameters vary from material to material. After training RNN models for 25 different von Mises materials, we identify the common core of a multi-task model. Furthermore, we demonstrate through ensemble transfer learning that adding a new material to the multi-task model requires datasets that are two to three orders of magnitude smaller than those needed for training an RNN from scratch. In addition to multi-task learning, we introduce probabilistic ensembles of RNN plasticity models to quantify the epistemic uncertainty. A deep drawing simulation demonstrates the superior generalization capabilities of RNNs identified via multi-task learning compared to those obtained through single-task training.
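
The shared-core idea maps naturally onto a multi-task network with a common recurrent trunk and per-material output heads. The following PyTorch sketch is illustrative only: the class names, the GRU cell, the layer sizes, and the strain-increment/stress interface are assumptions made for this example, not the authors' exact architecture.

```python
import torch
import torch.nn as nn


class MultiTaskPlasticityRNN(nn.Module):
    """Shared recurrent core plus one output head per material.

    Hypothetical sketch: cell type, dimensions, and the
    strain-increment -> stress interface are assumptions,
    not the paper's architecture.
    """

    def __init__(self, n_materials: int, in_dim: int = 6,
                 hidden_dim: int = 64, out_dim: int = 6):
        super().__init__()
        # Common core: parameters shared by all materials of the class.
        self.core = nn.GRU(in_dim, hidden_dim, batch_first=True)
        # Material-specific layers: parameters vary from material to material.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim) for _ in range(n_materials)]
        )

    def forward(self, strain_path: torch.Tensor,
                material_id: int) -> torch.Tensor:
        # strain_path: (batch, time, in_dim) sequence of strain increments.
        features, _ = self.core(strain_path)
        return self.heads[material_id](features)  # predicted stress path


def add_new_material(model: MultiTaskPlasticityRNN,
                     hidden_dim: int = 64, out_dim: int = 6) -> nn.Linear:
    """Transfer learning: freeze the common core and attach a fresh head,
    so only the small material-specific layer is fitted to the new data."""
    for p in model.core.parameters():
        p.requires_grad = False
    head = nn.Linear(hidden_dim, out_dim)
    model.heads.append(head)
    return head


def ensemble_predict(models, strain_path, material_id):
    """Probabilistic ensemble: the spread across independently trained
    models serves as a measure of epistemic uncertainty."""
    preds = torch.stack([m(strain_path, material_id) for m in models])
    return preds.mean(dim=0), preds.var(dim=0)
```

Under this reading, adding a new material leaves the shared core untouched and fits only a small head, which is consistent with the reported two-to-three-order-of-magnitude reduction in data needs; the ensemble variance is one common way to expose epistemic uncertainty, though the paper's specific probabilistic-ensemble construction may differ.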
