Extrapolating from neural network models: a cautionary tale
Journal of Physics G: Nuclear and Particle Physics (IF 3.4) Pub Date: 2021-06-22, DOI: 10.1088/1361-6471/abf08a
A. Pastore, M. Carnini

We present three different methods to estimate error bars on the predictions made using a neural network (NN). All of them represent lower bounds for the extrapolation errors. First, we illustrate the methods on a simple toy model; then, we apply them to a realistic case related to nuclear masses. Using theoretical data simulated either with a liquid-drop model or with a Skyrme energy density functional, we benchmark the extrapolation performance of the NN in regions of the Segrè chart far away from those used for training and validation. Finally, we discuss how error bars can help identify when the extrapolation becomes too uncertain and thus unreliable.
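The paper's central idea, that the spread of an ensemble of independently trained networks gives a lower bound on the extrapolation error, can be illustrated with a minimal sketch. The toy function, network size, and training details below are illustrative assumptions, not the paper's actual setup: a tiny one-hidden-layer MLP in plain NumPy is fit to noisy samples of sin(x) on [-2, 2], an ensemble is built from bootstrap resamples with different random initializations, and the ensemble standard deviation is compared at an interpolation point and at an extrapolated one.

```python
import numpy as np

def train_mlp(x, y, hidden=16, epochs=2000, lr=0.05, seed=0):
    """Train a tiny one-hidden-layer tanh MLP with full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    X, Y, n = x[:, None], y[:, None], len(x)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        P = H @ W2 + b2                   # network predictions
        dP = 2.0 * (P - Y) / n            # gradient of MSE w.r.t. P
        dW2 = H.T @ dP; db2 = dP.sum(0)
        dH = (dP @ W2.T) * (1.0 - H**2)   # backprop through tanh
        dW1 = X.T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda xq: (np.tanh(xq[:, None] @ W1 + b1) @ W2 + b2).ravel()

# Toy data: noisy sin(x), observed only on x in [-2, 2]
rng = np.random.default_rng(1)
x_tr = np.linspace(-2.0, 2.0, 60)
y_tr = np.sin(x_tr) + 0.05 * rng.normal(size=x_tr.size)

# Ensemble: each member sees a bootstrap resample and a different init
models = []
for k in range(10):
    idx = rng.integers(0, x_tr.size, x_tr.size)
    models.append(train_mlp(x_tr[idx], y_tr[idx], seed=k))

x_q = np.array([0.0, 4.0])                 # inside vs. outside the training range
preds = np.stack([m(x_q) for m in models])  # shape (ensemble, query)
spread = preds.std(axis=0)
print(f"ensemble std at x=0: {spread[0]:.3f}, at x=4: {spread[1]:.3f}")
```

The members agree where data constrain them and diverge outside the training window, so the spread grows with the extrapolation distance; since all members share the same architecture and data, this spread can only underestimate the true error, which is why such estimates are lower bounds.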




Updated: 2021-06-22