Optimal uncertainty-guided neural network training
Applied Soft Computing ( IF 8.7 ) Pub Date : 2020-11-10 , DOI: 10.1016/j.asoc.2020.106878
H. M. Dipu Kabir , Abbas Khosravi , Abdollah Kavousi-Fard , Saeid Nahavandi , Dipti Srinivasan

Neural network (NN)-based direct uncertainty quantification (UQ) methods have achieved state-of-the-art performance since the introduction of the first such method, the lower–upper-bound estimation (LUBE) method. However, currently available cost functions for uncertainty-guided NN training do not always converge, and not all converged NNs generate optimized prediction intervals (PIs). In recent years, researchers have proposed different quality criteria for PIs, which raises the question of their relative effectiveness. Most existing cost functions for uncertainty-guided NN training are not customizable, and the convergence of NN training is uncertain. Therefore, in this paper, we propose a highly customizable smooth cost function for developing NNs that construct optimal PIs. The method computes the optimized average width of PIs, PI-failure distances, and the PI coverage probability (PICP) for the test dataset. We examine the performance of the proposed method on wind power generation, electricity demand, and temperature forecast datasets. Results show that the proposed method reduces variation in the quality of PIs, accelerates training, and improves the convergence probability from 99.2% to 99.8%.
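The abstract evaluates prediction intervals by their coverage probability (PICP), average width, and PI-failure distances. A minimal sketch of how these three metrics could be computed is given below; the function name `pi_metrics` and the specific failure-distance definition (distance from an uncovered target to the nearer interval bound) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def pi_metrics(y, lower, upper):
    """Illustrative PI quality metrics: PICP, mean PI width, mean failure distance.

    y      : observed targets
    lower  : lower bounds of the prediction intervals
    upper  : upper bounds of the prediction intervals
    """
    y, lower, upper = map(np.asarray, (y, lower, upper))

    # PICP: fraction of targets that fall inside their interval
    covered = (y >= lower) & (y <= upper)
    picp = covered.mean()

    # Mean PI width (often normalized by the target range in the literature)
    mpiw = (upper - lower).mean()

    # Assumed failure distance: 0 if covered, else distance to the nearer bound
    failure = np.where(covered, 0.0,
                       np.minimum(np.abs(y - lower), np.abs(y - upper)))
    return picp, mpiw, failure.mean()

# Toy example: the third target (5.0) lies outside its interval [2.0, 4.0]
picp, mpiw, mfd = pi_metrics([1.0, 2.0, 5.0],
                             [0.5, 1.5, 2.0],
                             [1.5, 2.5, 4.0])
print(picp, mpiw, mfd)
```

A cost function of the kind the paper describes would combine such terms, penalizing low PICP and large widths, so that training trades coverage against interval sharpness.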




Updated: 2020-11-12