Current journal: SIAM Journal on Optimization
  • Convergence Rates of Damped Inertial Dynamics under Geometric Conditions and Perturbations
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-07-06
    O. Sebbouh; Ch. Dossal; A. Rondepierre

    SIAM Journal on Optimization, Volume 30, Issue 3, Page 1850-1877, January 2020. In this article a family of second-order ODEs associated with the inertial gradient descent is studied. These ODEs are widely used to build trajectories converging to a minimizer $x^*$ of a function $F$, possibly convex. This family includes the continuous version of the Nesterov inertial scheme and the continuous heavy

    Updated: 2020-07-07
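    For context on the entry above, the two best-known members of this ODE family, the continuous heavy-ball dynamics and the continuous limit of Nesterov's scheme, are usually written as follows (background only; the family and the damping terms analyzed in the paper may be more general):
    \[
    \ddot{x}(t) + \gamma\,\dot{x}(t) + \nabla F(x(t)) = 0 \quad\text{(heavy ball, constant damping $\gamma>0$)},
    \qquad
    \ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla F(x(t)) = 0 \quad\text{(continuous Nesterov dynamics, vanishing damping $\alpha>0$)}.
    \]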
  • Inexact Sequential Quadratic Optimization with Penalty Parameter Updates within the QP Solver
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-07-02
    James V. Burke; Frank E. Curtis; Hao Wang; Jiashan Wang

    SIAM Journal on Optimization, Volume 30, Issue 3, Page 1822-1849, January 2020. This paper focuses on the design of sequential quadratic optimization (commonly known as SQP) methods for solving large-scale nonlinear optimization problems. The most computationally demanding aspect of such an approach is the computation of the search direction during each iteration, for which we consider the use of matrix-free

    Updated: 2020-07-07
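    As background for the entry above: in a standard SQP method for minimizing $f(x)$ subject to $c(x)=0$, the search direction at the iterate $x_k$ is obtained from a quadratic program of roughly the following form (a generic textbook sketch; the paper's inexact, matrix-free variant with penalty parameter updates differs in the details):
    \[
    \min_{d\in\mathbb{R}^n}\ \nabla f(x_k)^{\mathsf T} d + \tfrac12\, d^{\mathsf T} H_k d
    \quad\text{subject to}\quad c(x_k) + \nabla c(x_k)^{\mathsf T} d = 0,
    \]
    where $H_k$ is (an approximation of) the Hessian of the Lagrangian.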
  • Revisiting EXTRA for Smooth Distributed Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-07-01
    Huan Li; Zhouchen Lin

    SIAM Journal on Optimization, Volume 30, Issue 3, Page 1795-1821, January 2020. EXTRA is a popular method for decentralized distributed optimization and has broad applications. This paper revisits EXTRA. First, we give a sharp complexity analysis for EXTRA with the improved $O\big(\big(\frac{L}{\mu}+\frac{1}{1-\sigma_2({W})}\big)\log\frac{1}{\epsilon(1-\sigma_2({W}))}\big)$ communication and computation

    Updated: 2020-07-01
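    As a reading aid for the complexity bound quoted above (using the assumptions usual in this literature, not a restatement of the paper's precise theorem): $L$ and $\mu$ are the smoothness and strong-convexity constants, $\sigma_2(W)$ is the second largest singular value of the mixing matrix $W$, and $\epsilon$ is the target accuracy, so the bound combines the condition number and the inverse spectral gap of the network:
    \[
    O\Big(\big(\underbrace{\tfrac{L}{\mu}}_{\text{condition number}} + \underbrace{\tfrac{1}{1-\sigma_2(W)}}_{\text{inverse spectral gap}}\big)\,\log\tfrac{1}{\epsilon\,(1-\sigma_2(W))}\Big).
    \]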
  • Local Minimizers of Semi-Algebraic Functions from the Viewpoint of Tangencies
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-07-01
    Tien-Son Pham

    SIAM Journal on Optimization, Volume 30, Issue 3, Page 1777-1794, January 2020. Consider a semialgebraic function $f\colon\mathbb{R}^n \to {\mathbb{R}},$ which is continuous around a point $\bar{x} \in \mathbb{R}^n.$ Using the so-called tangency variety of $f$ at $\bar{x},$ we first provide necessary and sufficient conditions for $\bar{x}$ to be a local minimizer of $f,$ and then in the case where

    Updated: 2020-07-01
  • Several Classes of Stationary Points for Rank Regularized Minimization Problems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-29
    Yulan Liu; Shujun Bi; Shaohua Pan

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1756-1775, January 2020. For the rank regularized minimization problem, we introduce several classes of stationary points by the problem itself and its equivalent reformulations including the mathematical program with an equilibrium constraint (MPEC), the global exact penalty of the MPEC, and the difference-of-convex surrogate yielded by eliminating

    Updated: 2020-06-30
  • Subgradients of Marginal Functions in Parametric Control Problems of Partial Differential Equations
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-25
    Nguyen Thanh Qui; Daniel Wachsmuth

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1724-1755, January 2020. This paper studies generalized differentiability properties of the marginal function of parametric optimal control problems governed by semilinear elliptic partial differential equations. We establish some upper estimates for the regular and the limiting subgradients of the marginal function for Hilbert parametric spaces

    Updated: 2020-06-30
  • Multicomposite Nonconvex Optimization for Training Deep Neural Networks
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-18
    Ying Cui; Ziyu He; Jong-Shi Pang

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1693-1723, January 2020. We present in this paper a novel deterministic algorithmic framework that enables the computation of a directional stationary solution of the empirical deep neural network training problem formulated as a multicomposite optimization problem with coupled nonconvexity and nondifferentiability. This is the first time to our

    Updated: 2020-06-30
  • Primal-Dual Stochastic Gradient Method for Convex Programs with Many Functional Constraints
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-18
    Yangyang Xu

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1664-1692, January 2020. The stochastic gradient method (SGM) has been popularly applied to solve optimization problems with an objective that is stochastic or an average of many functions. Most existing works on SGMs assume that the underlying problem is unconstrained or has an easy-to-project constraint set. In this paper, we consider problems

    Updated: 2020-06-30
  • An Inverse-Adjusted Best Response Algorithm for Nash Equilibria
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-18
    Francesco Caruso; Maria Carmela Ceparano; Jacqueline Morgan

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1638-1663, January 2020. Regarding the approximation of Nash equilibria in games where the players have a continuum of strategies, there exist various algorithms based on best response dynamics and on its relaxed variants: from one step to the next, a player's strategy is updated by using explicitly a best response to the strategies of the other

    Updated: 2020-06-30
  • A New Sequential Optimality Condition for Constrained Nonsmooth Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-16
    Elias S. Helou; Sandra A. Santos; Lucas E. A. Simões

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1610-1637, January 2020. We introduce a sequential optimality condition for locally Lipschitz constrained nonsmooth optimization, verifiable just using derivative information, and which holds even in the absence of any constraint qualification. We present a practical algorithm that generates iterates either fulfilling the new necessary optimality

    Updated: 2020-06-30
  • Strong Convex Nonlinear Relaxations of the Pooling Problem
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-11
    James Luedtke; Claudia D'Ambrosio; Jeff Linderoth; Jonas Schweiger

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1582-1609, January 2020. We investigate new convex relaxations for the pooling problem, a classic nonconvex production planning problem in which input materials are mixed in intermediate pools, with the outputs of these pools further mixed to make output products meeting given attribute percentage requirements. Our relaxations are derived by considering

    Updated: 2020-06-30
  • Adjusting Dual Iterates in the Presence of Critical Lagrange Multipliers
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-08
    Andreas Fischer; Alexey F. Izmailov; Wladimir Scheck

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1555-1581, January 2020. It is a well-known phenomenon that the presence of critical Lagrange multipliers in constrained optimization problems may cause a deterioration of the convergence speed of primal-dual Newton-type methods. Regardless of the method under consideration, we develop a new local technique for avoiding convergence to critical

    Updated: 2020-06-30
  • Exponential Decay in the Sensitivity Analysis of Nonlinear Dynamic Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-06-08
    Sen Na; Mihai Anitescu

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1527-1554, January 2020. In this paper, we study the sensitivity of discrete-time dynamic programs with nonlinear dynamics and objective to perturbations in the initial conditions and reference parameters. Under uniform controllability and boundedness assumptions for the problem data, we prove that the directional derivative of the optimal state

    Updated: 2020-06-30
  • Robust Optimality and Duality in Multiobjective Optimization Problems under Data Uncertainty
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-27
    Thai Doan Chuong

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1501-1526, January 2020. In this paper, we employ advanced techniques of variational analysis and generalized differentiation to examine robust optimality conditions and robust duality for an uncertain nonsmooth multiobjective optimization problem under arbitrary nonempty uncertainty sets. We establish necessary and sufficient optimality conditions

    Updated: 2020-06-30
  • On the Adaptivity of Stochastic Gradient-Based Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-27
    Lihua Lei; Michael I. Jordan

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1473-1500, January 2020. Stochastic gradient-based optimization has been a core enabling methodology in applications to large-scale problems in machine learning and related areas. Despite this progress, the gap between theory and practice remains significant, with theoreticians pursuing mathematical optimality at the cost of obtaining specialized

    Updated: 2020-06-30
  • A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-21
    Yura Malitsky; Matthew K. Tam

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1451-1472, January 2020. In this work, we propose a simple modification of the forward-backward splitting method for finding a zero in the sum of two monotone operators. Our method converges under the same assumptions as Tseng's forward-backward-forward method, namely, it does not require cocoercivity of the single-valued operator. Moreover, each

    Updated: 2020-06-30
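    For reference, the two classical iterations the entry above is compared against, for finding $0\in Ax+Bx$ with $A$ single-valued monotone and $B$ maximally monotone with resolvent $J_{\lambda B}=(I+\lambda B)^{-1}$, are (background only; the paper's modified scheme is not reproduced here):
    \[
    x^{k+1} = J_{\lambda B}\big(x^k - \lambda A x^k\big) \quad\text{(forward--backward, requires cocoercivity of $A$)},
    \]
    \[
    z^{k} = J_{\lambda B}\big(x^k - \lambda A x^k\big),\qquad x^{k+1} = z^{k} - \lambda\big(A z^{k} - A x^{k}\big) \quad\text{(Tseng's forward--backward--forward)}.
    \]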
  • On the Convergence of Stochastic Gradient Descent for Nonlinear Ill-Posed Problems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-19
    Bangti Jin; Zehui Zhou; Jun Zou

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1421-1450, January 2020. In this work, we analyze the regularizing property of the stochastic gradient descent for the numerical solution of a class of nonlinear ill-posed inverse problems in Hilbert spaces. At each step of the iteration, the method randomly chooses one equation from the nonlinear system to obtain an unbiased stochastic estimate

    Updated: 2020-06-30
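    A common way to write a stochastic gradient (Landweber-type) step for a system $F_i(x)=y_i^\delta$, $i=1,\dots,N$, in Hilbert spaces is the following; this is an illustrative sketch under standard assumptions and is not claimed to match the paper's exact scheme or step-size rule:
    \[
    x_{k+1} = x_k - \eta_k\, F_{i_k}'(x_k)^{*}\big(F_{i_k}(x_k) - y_{i_k}^{\delta}\big),
    \qquad i_k \ \text{drawn uniformly from}\ \{1,\dots,N\},
    \]
    where $F_{i}'(x)^{*}$ is the adjoint of the Fréchet derivative and $\eta_k>0$ is the step size.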
  • Inertial, Corrected, Primal-Dual Proximal Splitting
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-14
    Tuomo Valkonen

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1391-1420, January 2020. We study inertial versions of primal-dual proximal splitting, also known as the Chambolle--Pock method. Our starting point is the preconditioned proximal point formulation of this method. By adding correctors corresponding to the antisymmetric part of the relevant monotone operator, using a FISTA-style gap unrolling argument

    Updated: 2020-06-30
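    The non-inertial primal-dual proximal splitting (Chambolle--Pock) iteration that the entry above builds on, for problems of the form $\min_x G(x)+F(Kx)$, reads as follows (a textbook statement; the paper's inertial, corrected variants are not shown):
    \[
    x^{k+1} = \operatorname{prox}_{\tau G}\big(x^k - \tau K^{*} y^k\big),\qquad
    y^{k+1} = \operatorname{prox}_{\sigma F^{*}}\big(y^k + \sigma K(2x^{k+1} - x^k)\big),
    \]
    with primal and dual step sizes satisfying $\tau\sigma\|K\|^{2} < 1$.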
  • A Proximal Average for Prox-Bounded Functions
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-12
    J. Chen; X. Wang; C. Planiden

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1366-1390, January 2020. We construct a proximal average for two prox-bounded functions, which recovers the classical proximal average for two convex functions. The new proximal average transforms continuously in epi-topology from one proximal hull to the other. When one of the functions is differentiable, the new proximal average is differentiable

    Updated: 2020-06-30
  • Using Two-Dimensional Projections for Stronger Separation and Propagation of Bilinear Terms
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-11
    Benjamin Müller; Felipe Serrano; Ambros Gleixner

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1339-1365, January 2020. One of the most fundamental ingredients in mixed-integer nonlinear programming solvers is the well-known McCormick relaxation for a product of two variables $x$ and $y$ over a box-constrained domain. The starting point of this paper is the fact that the convex hull of the graph of $xy$ can be much tighter when computed

    Updated: 2020-06-30
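    For reference, the McCormick relaxation mentioned in the entry above replaces $z=xy$ over the box $x\in[x^L,x^U]$, $y\in[y^L,y^U]$ by its four linear envelope inequalities (standard background, independent of the paper's two-dimensional projection technique):
    \[
    z \ge x^L y + y^L x - x^L y^L,\qquad z \ge x^U y + y^U x - x^U y^U,
    \]
    \[
    z \le x^U y + y^L x - x^U y^L,\qquad z \le x^L y + y^U x - x^L y^U.
    \]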
  • Pathological Subgradient Dynamics
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-05-04
    Aris Daniilidis; Dmitriy Drusvyatskiy

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1327-1338, January 2020. We construct examples of Lipschitz continuous functions, with pathological subgradient dynamics in both continuous and discrete time. In both settings, the iterates generate bounded trajectories and yet fail to detect any (generalized) critical points of the function.

    Updated: 2020-06-30
  • Spectral Properties of Barzilai--Borwein Rules in Solving Singly Linearly Constrained Optimization Problems Subject to Lower and Upper Bounds
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-29
    Serena Crisci; Federica Porta; Valeria Ruggiero; Luca Zanni

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1300-1326, January 2020. In 1988, Barzilai and Borwein published a pioneering paper which opened the way to inexpensively accelerate first-order methods. In more detail, in the framework of unconstrained optimization, Barzilai and Borwein developed two strategies to select the step length in gradient descent methods with the aim of encoding some second-order

    Updated: 2020-06-30
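    The two Barzilai--Borwein step lengths referred to above are, in their classical unconstrained form (background; the paper analyzes their spectral behavior in the bound-constrained, singly linearly constrained setting),
    \[
    \alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\mathsf T} s_{k-1}}{s_{k-1}^{\mathsf T} y_{k-1}},
    \qquad
    \alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\mathsf T} y_{k-1}}{y_{k-1}^{\mathsf T} y_{k-1}},
    \qquad
    s_{k-1} = x_k - x_{k-1},\quad y_{k-1} = \nabla f(x_k) - \nabla f(x_{k-1}).
    \]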
  • A Subgradient-Based Approach for Finding the Maximum Feasible Subsystem with Respect to a Set
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-28
    Minglu Ye; Ting Kei Pong

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1274-1299, January 2020. We propose a subgradient-based method for finding the maximum feasible subsystem in a collection of closed sets with respect to a given closed set $C$ (MFS$_C$). In this method, we reformulate the MFS$_C$ problem as an $\ell_0$ optimization problem and construct a sequence of continuous optimization problems to approximate

    Updated: 2020-06-30
  • A Geometrical Analysis on Convex Conic Reformulations of Quadratic and Polynomial Optimization Problems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-28
    Sunyoung Kim; Masakazu Kojima; Kim-Chuan Toh

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1251-1273, January 2020. We present a unified geometrical analysis on the completely positive programming (CPP) reformulations of quadratic optimization problems (QOPs) and their extension to polynomial optimization problems (POPs) based on a class of geometrically defined nonconvex conic programs and their convexification. The class of nonconvex

    Updated: 2020-06-30
  • Exact Converging Bounds for Stochastic Dual Dynamic Programming via Fenchel Duality
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-28
    Vincent Leclère; Pierre Carpentier; Jean-Philippe Chancelier; Arnaud Lenoir; François Pacaud

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1223-1250, January 2020. The stochastic dual dynamic programming (SDDP) algorithm has become one of the main tools used to address convex multistage stochastic optimal control problems. Recently a large amount of work has been devoted to improving the convergence speed of the algorithm through cut selection and regularization, and to extending

    Updated: 2020-06-30
  • Distributed Algorithms with Finite Data Rates that Solve Linear Equations
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-28
    Jinlong Lei; Peng Yi; Guodong Shi; Brian D. O. Anderson

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1191-1222, January 2020. In this paper, we study network linear equations subject to digital communications with a finite data rate, where each node is associated with one equation from a system of linear equations. Each node holds a dynamic state and interacts with its neighbors through an undirected connected graph, where along each link the

    Updated: 2020-06-30
  • Subset Selection in Sparse Matrices
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-21
    Alberto Del Pia; Santanu S. Dey; Robert Weismantel

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1173-1190, January 2020. In subset selection we search for the best linear predictor that involves a small subset of variables. From a computational complexity viewpoint, subset selection is NP-hard and few classes are known to be solvable in polynomial time. Using mainly tools from discrete geometry, we show that some sparsity conditions on the

    Updated: 2020-06-30
  • On Stochastic and Deterministic Quasi-Newton Methods for Nonstrongly Convex Optimization: Asymptotic Convergence and Rate Analysis
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-16
    Farzad Yousefian; Angelia Nedić; Uday V. Shanbhag

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1144-1172, January 2020. Motivated by applications arising from large-scale optimization and machine learning, we consider stochastic quasi-Newton (SQN) methods for solving unconstrained convex optimization problems. Much of the convergence analysis of SQN methods, in both full and limited-memory regimes, requires the objective function to be strongly

    Updated: 2020-04-16
  • Scenario Approach for Minmax Optimization with Emphasis on the Nonconvex Case: Positive Results and Caveats
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-14
    Mishal Assif; Debasish Chatterjee; Ravi Banavar

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1119-1143, January 2020. We treat the so-called scenario approach, a popular probabilistic approximation method for robust minmax optimization problems via independent and identically distributed (i.i.d.) sampling from the uncertainty set, from various perspectives. The scenario approach is well studied in the important case of convex robust optimization

    Updated: 2020-04-14
  • Duality Gap Estimation via a Refined Shapley--Folkman Lemma
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-14
    Yingjie Bi; Ao Tang

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1094-1118, January 2020. Based on concepts like the $k$th convex hull and finer characterization of nonconvexity of a function, we propose a refinement of the Shapley--Folkman lemma and derive a new estimate for the duality gap of nonconvex optimization problems with separable objective functions. We apply our result to the network utility maximization

    Updated: 2020-04-14
  • A Shifted Primal-Dual Penalty-Barrier Method for Nonlinear Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-06
    Philip E. Gill; Vyacheslav Kungurtsev; Daniel P. Robinson

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1067-1093, January 2020. In nonlinearly constrained optimization, penalty methods provide an effective strategy for handling equality constraints, while barrier methods provide an effective approach for the treatment of inequality constraints. A new algorithm for nonlinear optimization is proposed based on minimizing a shifted primal-dual penalty-barrier

    Updated: 2020-04-06
  • A Data-Independent Distance to Infeasibility for Linear Conic Systems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-04-02
    Javier Peña; Vera Roshchina

    SIAM Journal on Optimization, Volume 30, Issue 2, Page 1049-1066, January 2020. We offer a unified treatment of distinct measures of well-posedness for homogeneous conic systems. To that end, we introduce a distance to infeasibility based entirely on geometric considerations of the elements defining the conic system. Our approach sheds new light on and connects several well-known condition measures

    Updated: 2020-04-02
  • Erratum: A Faster Algorithm Solving a Generalization of Isotonic Median Regression and a Class of Fused Lasso Problems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-30
    Dorit S. Hochbaum; Cheng Lu

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 1048-1048, January 2020. On page 2568 of our paper A faster algorithm solving a generalization of isotonic median regression and a class of fused lasso problems, we stated the following: “Note that Kolmogorov, Pock, and Rolinek in Total variation on a tree also claimed an $O(n\log \log n)$ algorithm for the PL-wFL-O(1) problem. However

    Updated: 2020-03-30
  • Limitations on the Expressive Power of Convex Cones without Long Chains of Faces
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-30
    James Saunderson

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 1033-1047, January 2020. A convex optimization problem in conic form involves minimizing a linear functional over the intersection of a convex cone and an affine subspace. In some cases, it is possible to replace a conic formulation using a certain cone, with a “lifted” conic formulation using another cone that is higher-dimensional, but simpler

    Updated: 2020-03-30
  • Projective Cutting-Planes
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-26
    Daniel Porumbel

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 1007-1032, January 2020. Given a polytope $\mathcal P$, an interior point ${x}\in\mathcal P$, and a direction ${d}\in\mathbb{R}^n$, the projection of ${x}$ along ${d}$ asks to find the maximum step length $t^*$ such that ${x}+t^*{d}\in\mathcal P$; we say ${x}+t^*{d}$ is the pierce point obtained by projection. In [D. Porumbel, Math. Program., 155

    Updated: 2020-03-26
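    When the polytope is given explicitly as $\mathcal P=\{x : Ax\le b\}$ (an assumption made here purely for illustration; the paper works with the same oracle setting as for separation and pricing), the step length $t^*$ of the projection described above reduces to a simple ratio test, sketched below in Python.

        import numpy as np

        def pierce_point(A, b, x, d):
            """Return (t_star, x + t_star * d) for the polytope {x : A x <= b}.

            Assumes x is strictly feasible (A @ x < b componentwise); returns
            (inf, None) if the ray x + t*d never leaves the polytope.
            """
            Ad = A @ d
            slack = b - A @ x                # positive by strict feasibility
            mask = Ad > 1e-12                # only rows tightened by moving along d
            if not np.any(mask):
                return np.inf, None          # unbounded in direction d
            t_star = np.min(slack[mask] / Ad[mask])
            return t_star, x + t_star * d

        # Example: unit box [0,1]^2, from its center along d = (1, 0.5): t_star = 0.5.
        A = np.array([[1., 0.], [0., 1.], [-1., 0.], [0., -1.]])
        b = np.array([1., 1., 0., 0.])
        print(pierce_point(A, b, np.array([0.5, 0.5]), np.array([1.0, 0.5])))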
  • The Boosted Difference of Convex Functions Algorithm for Nonsmooth Functions
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-23
    Francisco J. Aragón Artacho; Phan T. Vuong

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 980-1006, January 2020. The boosted difference of convex functions algorithm (BDCA) was recently proposed for minimizing smooth difference of convex (DC) functions. BDCA accelerates the convergence of the classical difference of convex functions algorithm (DCA) thanks to an additional line search step. The purpose of this paper is twofold. First

    Updated: 2020-03-23
  • A Single Timescale Stochastic Approximation Method for Nested Stochastic Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-17
    Saeed Ghadimi; Andrzej Ruszczyński; Mengdi Wang

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 960-979, January 2020. We study constrained nested stochastic optimization problems in which the objective function is a composition of two smooth functions whose exact values and derivatives are not available. We propose a single timescale stochastic approximation algorithm, which we call the nested averaged stochastic approximation (NASA), to

    Updated: 2020-03-17
  • A Distributed Flexible Delay-Tolerant Proximal Gradient Algorithm
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-17
    Konstantin Mishchenko; Franck Iutzeler; Jérôme Malick

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 933-959, January 2020. We develop and analyze an asynchronous algorithm for distributed convex optimization when the objective can be written as a sum of smooth functions, local to each worker, and a nonsmooth function. Unlike many existing methods, our distributed algorithm is adjustable to various levels of communication cost, delays, machines'

    Updated: 2020-03-17
  • A Linear-Time Algorithm for Generalized Trust Region Subproblems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-12
    Rujun Jiang; Duan Li

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 915-932, January 2020. In this paper, we provide the first provable linear-time (in terms of the number of nonzero entries of the input) algorithm for approximately solving the generalized trust region subproblem (GTRS) of minimizing a quadratic function over a quadratic constraint under some regularity condition. Our algorithm is motivated by

    Updated: 2020-03-12
  • Complexity and Approximability of Optimal Resource Allocation and Nash Equilibrium over Networks
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-12
    S. Rasoul Etesami

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 885-914, January 2020. Motivated by emerging resource allocation and data placement problems such as web caches and peer-to-peer systems, we consider and study a class of resource allocation problems over a network of agents (nodes). In this model, which can be viewed as a homogeneous data placement problem, nodes can store only a limited number

    Updated: 2020-03-12
  • Calculus Identities for Generalized Simplex Gradients: Rules and Applications
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-10
    Warren Hare; Gabriel Jarry-Bolduc

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 853-884, January 2020. Simplex gradients, essentially the gradient of a linear approximation, are a popular tool in derivative-free optimization (DFO). In 2015, a product rule, a quotient rule, and a sum rule for simplex gradients were introduced by Regis [Optim. Lett., 9 (2015), pp. 845--865]. Unfortunately, those calculus rules only work under

    Updated: 2020-03-10
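    As background on the object the entry above generalizes: given a base point $x_0$ and $n$ linearly independent directions collected as the columns of $S=[s_1,\dots,s_n]$, the classical simplex gradient is the gradient of the linear interpolant of $f$ on the sample set $\{x_0,\, x_0+s_1,\dots,\, x_0+s_n\}$,
    \[
    \nabla_S f(x_0) = (S^{\mathsf T})^{-1}\,\delta_S f(x_0),
    \qquad
    \big(\delta_S f(x_0)\big)_i = f(x_0+s_i) - f(x_0),\quad i=1,\dots,n.
    \]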
  • Asymptotic Results of Stochastic Decomposition for Two-Stage Stochastic Quadratic Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-03
    Junyi Liu; Suvrajeet Sen

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 823-852, January 2020. This paper presents the stochastic decomposition (SD) algorithms for two classes of stochastic programming problems: (1) two-stage stochastic quadratic-linear programming (SQLP) in which a quadratic program defines the objective function in the first stage and a linear program defines the value function in the second stage

    Updated: 2020-03-03
  • High Degree Sum of Squares Proofs, Bienstock--Zuckerberg Hierarchy, and Chvátal--Gomory Cuts
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-03
    Monaldo Mastrolilli

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 798-822, January 2020. Chvátal--Gomory cuts (CG-cuts) and the Bienstock--Zuckerberg hierarchy capture useful linear programs that the standard bounded degree sum-of-squares (SoS) hierarchy fails to capture. In this paper we present a novel polynomial time SoS hierarchy for 0/1 problems with a custom subspace of high degree polynomials (not the

    Updated: 2020-03-03
  • Exact Augmented Lagrangian Duality for Mixed Integer Quadratic Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-03-03
    Xiaoyi Gu; Shabbir Ahmed; Santanu S. Dey

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 781-797, January 2020. Mixed integer quadratic programming (MIQP) is the problem of minimizing a quadratic function over mixed integer points in a rational polyhedron. This paper focuses on the augmented Lagrangian dual (ALD) for MIQP. ALD augments the usual Lagrangian dual with a weighted nonlinear penalty on the dualized constraints. We first

    Updated: 2020-03-03
  • Stability and Error Analysis for Optimization and Generalized Equations
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-27
    Johannes O. Royset

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 752-780, January 2020. Stability and error analysis remain challenging for problems that lack regularity properties near solutions, are subject to large perturbations, and might be infinite-dimensional. We consider nonconvex optimization and generalized equations defined on metric spaces and develop bounds on solution errors using the truncated

    Updated: 2020-02-27
  • Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-27
    Necdet Serhat Aybat; Alireza Fallah; Mert Gürbüzbalaban; Asuman Ozdaglar

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 717-751, January 2020. We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm. We focus on gradient descent and accelerated gradient (AG) methods for minimizing strongly convex functions when the gradient has random errors in the form of additive white noise. With gradient errors

    Updated: 2020-02-27
  • On the Convergence of Mirror Descent beyond Stochastic Convex Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-27
    Zhengyuan Zhou; Panayotis Mertikopoulos; Nicholas Bambos; Stephen P. Boyd; Peter W. Glynn

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 687-716, January 2020. In this paper, we examine the convergence of mirror descent in a class of stochastic optimization problems that are not necessarily convex (or even quasi-convex) and which we call variationally coherent. Since the standard technique of “ergodic averaging” offers no tangible benefits beyond convex programming, we focus directly

    Updated: 2020-02-27
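    For reference, the mirror descent iteration examined above can be written, with stochastic gradient $\hat g_k$, step size $\eta_k$, and Bregman divergence $D_h$ induced by a distance-generating function $h$, as (a standard statement of the method; the paper's contribution concerns convergence under variational coherence, not the iteration itself)
    \[
    x_{k+1} = \operatorname*{arg\,min}_{x\in\mathcal X}\ \big\{ \eta_k\,\langle \hat g_k, x\rangle + D_h(x, x_k) \big\},
    \qquad
    D_h(x,y) = h(x) - h(y) - \langle \nabla h(y),\, x-y\rangle.
    \]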
  • Nonconvex Robust Low-Rank Matrix Recovery
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-25
    Xiao Li; Zhihui Zhu; Anthony Man-Cho So; René Vidal

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 660-686, January 2020. In this paper, we study the problem of recovering a low-rank matrix from a number of random linear measurements that are corrupted by outliers taking arbitrary values. We consider a nonsmooth nonconvex formulation of the problem, in which we explicitly enforce the low-rank property of the solution by using a factored representation

    Updated: 2020-02-25
  • Spectral Operators of Matrices: Semismoothness and Characterizations of the Generalized Jacobian
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-20
    Chao Ding; Defeng Sun; Jie Sun; Kim-Chuan Toh

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 630-659, January 2020. Spectral operators of matrices proposed recently in [C. Ding, D. F. Sun, J. Sun, and K. C. Toh, Math. Program., 168 (2018), pp. 509--531] are a class of matrix-valued functions, which map matrices to matrices by applying a vector-to-vector function to all eigenvalues/singular values of the underlying matrices. Spectral operators

    Updated: 2020-02-20
  • Covering on a Convex Set in the Absence of Robinson's Regularity
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-20
    Aram V. Arutyunov; Alexey F. Izmailov

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 604-629, January 2020. We study stability properties of a given solution of a constrained equation, where the constraint has the form of the inclusion into an arbitrary closed convex set. We are mostly interested in those cases when Robinson's regularity condition does not hold, and we obtain weaker conditions ensuring stability of a given solution

    Updated: 2020-02-20
  • Critical Cones for Sufficient Second Order Conditions in PDE Constrained Optimization
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-20
    Eduardo Casas; Mariano Mateos

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 585-603, January 2020. In this paper, we analyze optimal control problems governed by semilinear parabolic equations. Box constraints for the controls are imposed, and the cost functional involves the state and possibly a sparsity-promoting term, but not a Tikhonov regularization term. Unlike finite dimensional optimization or control problems

    Updated: 2020-02-20
  • Convergence Analysis of the Relaxed Douglas--Rachford Algorithm
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-20
    D. Russell Luke; Anna-Lena Martins

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 542-584, January 2020. Motivated by nonconvex, inconsistent feasibility problems in imaging, the relaxed alternating averaged reflections algorithm, or relaxed Douglas--Rachford algorithm (DR$\lambda$), was first proposed over a decade ago. Convergence results for this algorithm are limited to either convex feasibility or consistent nonconvex feasibility

    Updated: 2020-02-20
  • Sharp Worst-Case Evaluation Complexity Bounds for Arbitrary-Order Nonconvex Optimization with Inexpensive Constraints
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-20
    Coralia Cartis; Nicholas I. M. Gould; Philippe L. Toint

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 513-541, January 2020. We provide sharp worst-case evaluation complexity bounds for nonconvex minimization problems with general inexpensive constraints, i.e., problems where the cost of evaluating/enforcing the (possibly nonconvex or even disconnected) constraints, if any, is negligible compared to that of evaluating the objective function

    Updated: 2020-02-20
  • Well-Posed Solvability of Convex Optimization Problems on a Differentiable or Continuous Closed Convex Set
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Xi Yin Zheng

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 490-512, January 2020. Given a closed convex set A in a Banach space X, this paper considers the continuity and differentiability of A. The continuity of a closed convex set was introduced and studied by Gale and Klee [Math. Scand., 7 (1959), pp. 370--391] in terms of its support functional, and the differentiability of a closed convex set is a

    Updated: 2020-02-06
  • Payoffs-Beliefs Duality and the Value of Information
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Michel De Lara; Olivier Gossner

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 464-489, January 2020. In decision problems under incomplete information, actions (identified to payoff vectors indexed by states of nature) and beliefs are naturally paired by bilinear duality. We exploit this duality to analyze the value of information, using concepts and tools from convex analysis. We define the value function as the support

    Updated: 2020-02-06
  • Principal Component Analysis by Optimization of Symmetric Functions has no Spurious Local Optima
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Armin Eftekhari; Raphael A. Hauser

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 439-463, January 2020. Principal component analysis (PCA) finds the best linear representation of data and is an indispensable tool in many learning and inference tasks. Classically, principal components of a dataset are interpreted as the directions that preserve most of its “energy,” an interpretation that is theoretically underpinned by the

    Updated: 2020-02-06
  • Inexact Cuts in Stochastic Dual Dynamic Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Vincent Guigues

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 407-438, January 2020. We introduce an extension of stochastic dual dynamic programming (SDDP) to solve stochastic convex dynamic programming equations. This extension applies when some or all primal and dual subproblems to be solved along the forward and backward passes of the method are solved with bounded errors (inexactly). This inexact variant

    Updated: 2020-02-06
  • Risk-Averse Models in Bilevel Stochastic Linear Programming
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Johanna Burtscheidt; Matthias Claus; Stephan Dempe

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 377-406, January 2020. We consider a two-stage stochastic bilevel linear program where the leader contemplates the follower's reaction at the second stage optimistically. In this setting, the leader's objective function value can be modeled by a random variable, which we evaluate based on some law-invariant (quasi-)convex risk measure. After establishing

    Updated: 2020-02-06
  • A Stochastic Line Search Method with Expected Complexity Analysis
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    Courtney Paquette; Katya Scheinberg

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 349-376, January 2020. For deterministic optimization, line search methods augment algorithms by providing stability and improved efficiency. Here we adapt a classical backtracking Armijo line search to the stochastic optimization setting. While traditional line search relies on exact computations of the gradient and values of the objective function

    Updated: 2020-02-06
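    The classical deterministic backtracking Armijo line search that the entry above adapts to the stochastic setting can be sketched as follows (a textbook version with hypothetical parameter names, shown only as the deterministic baseline, not as the paper's stochastic procedure).

        def backtracking_armijo(f, grad_f, x, d, alpha0=1.0, c1=1e-4, rho=0.5, max_iter=50):
            """Backtrack from alpha0 until the Armijo sufficient-decrease condition holds.

            f       : objective function
            grad_f  : gradient of f
            x       : current iterate (e.g., a numpy array)
            d       : descent direction (e.g., d = -grad_f(x))
            Returns the accepted step size alpha.
            """
            fx = f(x)
            slope = grad_f(x) @ d        # directional derivative; negative for a descent direction
            alpha = alpha0
            for _ in range(max_iter):
                if f(x + alpha * d) <= fx + c1 * alpha * slope:
                    return alpha         # sufficient decrease achieved
                alpha *= rho             # shrink the step and try again
            return alpha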
  • Existence of Lagrange Multipliers under Gâteaux Differentiable Data with Applications to Stochastic Optimal Control Problems
    SIAM J. Optim. (IF 2.247) Pub Date : 2020-02-06
    A. Jourani; F. J. Silva

    SIAM Journal on Optimization, Volume 30, Issue 1, Page 319-348, January 2020. The main objective of this work is to study the existence of Lagrange multipliers for infinite dimensional problems under Gâteaux differentiability assumptions on the data. Our investigation follows two main steps: the proof of the existence of Lagrange multipliers under a calmness assumption on the constraints and the study

    Updated: 2020-02-06
Contents have been reproduced by permission of the publishers.