Learning to differentiate
Journal of Computational Physics (IF 3.8), Pub Date: 2020-09-25, DOI: 10.1016/j.jcp.2020.109873
Oskar Ålund, Gianluca Iaccarino, Jan Nordström

Artificial neural networks together with associated computational libraries provide a powerful framework for constructing both classification and regression algorithms. In this paper we use neural networks to design linear and non-linear discrete differential operators. We show that neural network based operators can be used to construct stable discretizations of initial boundary-value problems by ensuring that the operators satisfy a discrete analogue of integration-by-parts known as summation-by-parts. Our neural network approach with linear activation functions is compared and contrasted with a more traditional linear algebra approach. An application to overlapping grids is explored. The strategy developed in this work opens the door for constructing stable differential operators on general meshes.
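The summation-by-parts (SBP) property mentioned above can be stated concretely: a discrete first-derivative operator D is SBP if D = H⁻¹Q for a positive-definite norm matrix H with Q + Qᵀ = B = diag(−1, 0, …, 0, 1), so that uᵀH(Dv) + (Du)ᵀHv equals the boundary terms of integration by parts. As a hedged sketch (not the paper's learned operators), the classical second-order finite-difference operator illustrates this; the grid size and spacing below are arbitrary choices:

```python
import numpy as np

# Illustrative sketch of the SBP property with the standard second-order
# finite-difference operator; the paper instead *learns* such operators
# with neural networks.
n, h = 11, 0.1                      # number of grid points and spacing (arbitrary)

# Diagonal norm matrix H: a quadrature rule (trapezoidal weights).
H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])

# First-derivative operator D: one-sided at the boundaries, central inside.
D = np.zeros((n, n))
D[0, :2] = [-1.0, 1.0]
D[-1, -2:] = [-1.0, 1.0]
for i in range(1, n - 1):
    D[i, i - 1], D[i, i + 1] = -0.5, 0.5
D /= h

# SBP property: Q + Q^T = B, where Q = H D and B = diag(-1, 0, ..., 0, 1)
# carries exactly the boundary terms of integration by parts.
Q = H @ D
B = np.zeros((n, n))
B[0, 0], B[-1, -1] = -1.0, 1.0
print(np.allclose(Q + Q.T, B))      # → True
```

Because Q + Qᵀ = B holds exactly, the discrete analogue of integration by parts, uᵀH(Dv) + (Du)ᵀHv = u_N v_N − u_0 v_0, holds for any grid functions u and v, which is what yields provably stable discretizations of initial boundary value problems.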




Updated: 2020-10-02