Sample average approximation for stochastic nonconvex mixed integer nonlinear programming via outer-approximation
Optimization and Engineering (IF 2.0) Pub Date: 2020-09-26, DOI: 10.1007/s11081-020-09563-2
Can Li , David E. Bernal , Kevin C. Furman , Marco A. Duran , Ignacio E. Grossmann

We propose a sample average approximation-based outer-approximation algorithm (SAAOA) that can address nonconvex two-stage stochastic programs (SP) with any continuous or discrete probability distribution. Previous work considered this approach for convex two-stage SP (Wei and Realff in Comput Chem Eng 28(3):333–346, 2004). The SAAOA algorithm performs internal sampling within a nonconvex outer-approximation algorithm, iterating between a mixed-integer linear programming (MILP) master problem and a nonconvex nonlinear programming (NLP) subproblem. We prove that the optimal solutions and optimal value obtained by the SAAOA algorithm converge to the optimal solutions and optimal value of the true SP problem as the sample size goes to infinity. We also give the convergence rate, which yields an estimate of the required sample size. Since the theoretical sample size estimate is too conservative in practice, we propose an SAAOA algorithm that maintains confidence intervals for the upper and lower bounds at each iteration. Two policies are proposed to update the sample sizes dynamically within the SAAOA algorithm with confidence intervals. The proposed algorithm works well for the special case of pure binary first-stage variables and continuous second-stage variables, since in this case the nonconvex NLPs can be solved independently for each scenario. The proposed algorithm is tested on a stochastic pooling problem and is shown to outperform the external sampling approach, in which large-scale MINLPs need to be solved.
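The core practical idea above is that a sample average approximation (SAA) of the expected second-stage cost comes with a statistical confidence interval, and the sample size can be grown dynamically until that interval is tight enough. The sketch below illustrates this mechanism only; the `second_stage_cost` function is a hypothetical toy stand-in for the nonconvex NLP subproblem, and the doubling policy is one simple assumed update rule, not the paper's specific policies.

```python
import math
import random
import statistics

def second_stage_cost(x, xi):
    # Toy nonconvex second-stage cost Q(x, xi) for first-stage decision x
    # under scenario xi (hypothetical stand-in for the NLP subproblem).
    return (x - xi) ** 2 + 0.5 * math.sin(3 * x * xi)

def saa_estimate(x, n, rng):
    # Sample average approximation of E[Q(x, xi)] over n i.i.d. scenarios,
    # with an approximate 95% confidence-interval half-width.
    samples = [second_stage_cost(x, rng.gauss(0.0, 1.0)) for _ in range(n)]
    mean = statistics.fmean(samples)
    half_width = 1.96 * statistics.stdev(samples) / math.sqrt(n)
    return mean, half_width

def adaptive_saa(x, tol=0.05, n0=100, growth=2, max_n=200_000, seed=0):
    # Grow the sample size until the CI half-width drops below tol,
    # mimicking a dynamic sample-size update policy within SAA.
    rng = random.Random(seed)
    n = n0
    while True:
        mean, hw = saa_estimate(x, n, rng)
        if hw <= tol or n >= max_n:
            return mean, hw, n
        n *= growth

mean, hw, n = adaptive_saa(x=0.7)
print(f"estimate = {mean:.3f} +/- {hw:.3f} using n = {n} samples")
```

In the full SAAOA algorithm this estimation step would be embedded in the outer-approximation loop, with the confidence intervals attached to the upper and lower bounds of the master/subproblem iteration rather than to a single fixed decision.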




Updated: 2020-09-26