-
An Exact Algorithmic Framework for a Class of Mixed-Integer Programs with Equilibrium Constraints SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-19 Teodora Dan; Andrea Lodi; Patrice Marcotte
SIAM Journal on Optimization, Volume 31, Issue 1, Page 275-306, January 2021. In this study, we consider a rich class of mathematical programs with equilibrium constraints (MPECs) involving both integer and continuous variables. Such a class, which subsumes mathematical programs with complementarity constraints as well as bilevel programs involving lower-level convex programs, is, in general, extremely
-
Convergence and Dynamical Behavior of the ADAM Algorithm for Nonconvex Stochastic Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-13 Anas Barakat; Pascal Bianchi
SIAM Journal on Optimization, Volume 31, Issue 1, Page 244-274, January 2021. Adam is a popular variant of stochastic gradient descent for finding a local minimizer of a function. In the constant stepsize regime, assuming that the objective function is differentiable and nonconvex, we establish the convergence in the long run of the iterates to a stationary point under a stability condition. The key
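For orientation, a minimal NumPy sketch of the constant-stepsize Adam update whose long-run behavior the paper analyzes; the hyperparameter values are the usual defaults and are illustrative, not taken from the paper.

import numpy as np

def adam_step(x, grad, m, v, t, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and of its square.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction compensating for the zero initialization of m and v.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Constant-stepsize update: alpha is held fixed, the regime studied here.
    return x - alpha * m_hat / (np.sqrt(v_hat) + eps), m, v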
-
An Average Curvature Accelerated Composite Gradient Method for Nonconvex Smooth Composite Optimization Problems SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-13 Jiaming Liang; Renato D. C. Monteiro
SIAM Journal on Optimization, Volume 31, Issue 1, Page 217-243, January 2021. This paper presents an accelerated composite gradient (ACG) variant, referred to as the AC-ACG method, for solving nonconvex smooth composite minimization problems. As opposed to well-known ACG variants that are based on either a known Lipschitz gradient constant or a sequence of maximum observed curvatures, the current one
-
Distance-Sparsity Transference for Vertices of Corner Polyhedra SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-13 Iskander Aliev; Marcel Celaya; Martin Henk; Aled Williams
SIAM Journal on Optimization, Volume 31, Issue 1, Page 200-216, January 2021. We obtain a transference bound for vertices of corner polyhedra that connects two well-established areas of research: proximity and sparsity of solutions to integer programs. In the knapsack scenario, it implies that for any vertex $x^*$ of an integer feasible knapsack polytope $P(a, b)=\{x \in \mathbb{R}^n_{\ge
-
Convergence Analysis of Gradient Algorithms on Riemannian Manifolds without Curvature Constraints and Application to Riemannian Mass SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-12 Jinhua Wang; Xiangmei Wang; Chong Li; Jen-Chih Yao
SIAM Journal on Optimization, Volume 31, Issue 1, Page 172-199, January 2021. We study the convergence issue for the gradient algorithm (employing general step sizes) for optimization problems on general Riemannian manifolds (without curvature constraints). Under the assumption of the local convexity/quasi-convexity (resp., weak sharp minima), local/global convergence (resp., linear convergence) results
-
Spectral Relaxations and Branching Strategies for Global Optimization of Mixed-Integer Quadratic Programs SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-12 Carlos J. Nohra; Arvind U. Raghunathan; Nikolaos Sahinidis
SIAM Journal on Optimization, Volume 31, Issue 1, Page 142-171, January 2021. We consider the global optimization of nonconvex (mixed-integer) quadratic programs. We present a family of convex quadratic relaxations derived by convexifying nonconvex quadratic functions through perturbations of the quadratic matrix. We investigate the theoretical properties of these relaxations and show that they are
-
Chordal-TSSOS: A Moment-SOS Hierarchy That Exploits Term Sparsity with Chordal Extension SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-12 Jie Wang; Victor Magron; Jean-Bernard Lasserre
SIAM Journal on Optimization, Volume 31, Issue 1, Page 114-141, January 2021. This work is a follow-up and a complement to [J. Wang, V. Magron and J. B. Lasserre, preprint, arXiv:1912.08899, 2019] where the TSSOS hierarchy was proposed for solving polynomial optimization problems (POPs). The chordal-TSSOS hierarchy that we propose is a new sparse moment-SOS framework based on term sparsity and chordal
-
A Globally Convergent SQCQP Method for Multiobjective Optimization Problems SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-12 Md Abu Talhamainuddin Ansary; Geetanjali Panda
SIAM Journal on Optimization, Volume 31, Issue 1, Page 91-113, January 2021. In this article, the concept of the single-objective sequential quadratically constrained quadratic programming method is extended to the multiobjective case and a new line search technique is developed for nonlinear multiobjective optimization problems. The proposed method ensures global convergence as well as spreading of
-
Convergence of Newton-MR under Inexact Hessian Information SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-07 Yang Liu; Fred Roosta
SIAM Journal on Optimization, Volume 31, Issue 1, Page 59-90, January 2021. Recently, there has been a surge of interest in designing variants of the classical Newton-CG in which the Hessian of a (strongly) convex function is replaced by suitable approximations. This is mainly motivated by large-scale finite-sum minimization problems that arise in many machine learning applications. Going beyond convexity
-
TSSOS: A Moment-SOS Hierarchy That Exploits Term Sparsity SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-07 Jie Wang; Victor Magron; Jean-Bernard Lasserre
SIAM Journal on Optimization, Volume 31, Issue 1, Page 30-58, January 2021. This paper is concerned with polynomial optimization problems. We show how to exploit term (or monomial) sparsity of the input polynomials to obtain a new converging hierarchy of semidefinite programming relaxations. The novelty (and distinguishing feature) of such relaxations is to involve block-diagonal matrices obtained
-
An Interior-Point Approach for Solving Risk-Averse PDE-Constrained Optimization Problems with Coherent Risk Measures SIAM J. Optim. (IF 2.247) Pub Date : 2021-01-05 Sebastian Garreis; Thomas M. Surowiec; Michael Ulbrich
SIAM Journal on Optimization, Volume 31, Issue 1, Page 1-29, January 2021. The prevalence of uncertainty in models of engineering and the natural sciences necessitates the inclusion of random parameters in the underlying partial differential equations (PDEs). The resulting decision problems governed by the solution of such random PDEs are infinite dimensional stochastic optimization problems. In order
-
Generalized Subdifferentials of Spectral Functions over Euclidean Jordan Algebras SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-18 Bruno F. Lourenço; Akiko Takeda
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3387-3414, January 2020. This paper is devoted to the study of generalized subdifferentials of spectral functions over Euclidean Jordan algebras. Spectral functions often appear in optimization problems, playing the role of “regularizer,” “barrier,” “penalty function,” and many others. We provide formulae for the regular, approximate, and horizon
-
Scalable Algorithms for the Sparse Ridge Regression SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-17 Weijun Xie; Xinwei Deng
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3359-3386, January 2020. Sparse regression and variable selection for large-scale data have been rapidly developed in the past decades. This work focuses on sparse ridge regression, which enforces the sparsity by use of the $L_{0}$ norm. We first prove that the continuous relaxation of the mixed integer second order conic (MISOC) reformulation
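For reference, the sparse ridge regression model can be written with an explicit cardinality constraint (a standard formulation; details such as the constraint-versus-penalty form may differ from the paper's):

$$\min_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2 \quad \text{s.t.} \quad \|\beta\|_0 \le k,$$

where $\|\beta\|_0$ counts the nonzero entries of $\beta$.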
-
Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-14 Felix Lieder
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3345-3358, January 2020. Cubic regularization methods have several favorable properties. In particular under mild assumptions, they are globally convergent towards critical points with second-order necessary conditions satisfied. Their adoption among practitioners, however, does not yet match the strong theoretical results. One of the reasons for
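The cubic-regularized subproblem solved at each iteration of such methods has the standard Nesterov--Polyak form

$$\min_{s \in \mathbb{R}^n} \; g^\top s + \tfrac{1}{2}\, s^\top H s + \tfrac{\sigma}{3}\, \|s\|^3,$$

with gradient $g$, (approximate) Hessian $H$, and regularization weight $\sigma > 0$; as the title indicates, the paper's contribution concerns solving this subproblem at scale via a generalized eigenvalue problem.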
-
Stochastic Conditional Gradient++: (Non)Convex Minimization and Continuous Submodular Maximization SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-14 Hamed Hassani; Amin Karbasi; Aryan Mokhtari; Zebang Shen
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3315-3344, January 2020. In this paper, we consider the general nonoblivious stochastic optimization where the underlying stochasticity may change during the optimization procedure and depends on the point at which the function is evaluated. We develop Stochastic Frank--Wolfe++ (SFW++), an efficient variant of the conditional gradient method for
-
Fair Packing and Covering on a Relative Scale SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-10 Jelena Diakonikolas; Maryam Fazel; Lorenzo Orecchia
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3284-3314, January 2020. Fair resource allocation is a fundamental optimization problem with applications in operations research, networking, and economic and game theory. Research in these areas has led to the general acceptance of a class of $\alpha$-fair utility functions parameterized by $\alpha \in [0, \infty]$. We consider $\alpha$-fair packing---the
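The $\alpha$-fair utility family referred to here is standard:

$$u_\alpha(x) = \begin{cases} \dfrac{x^{1-\alpha}}{1-\alpha}, & \alpha \ge 0,\ \alpha \neq 1,\\[1ex] \log x, & \alpha = 1, \end{cases}$$

so that $\alpha = 0$ recovers total throughput, $\alpha = 1$ proportional fairness, and $\alpha \to \infty$ max-min fairness.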
-
Newton-like Inertial Dynamics and Proximal Algorithms Governed by Maximally Monotone Operators SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-03 Hedy Attouch; Szilárd Csaba László
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3252-3283, January 2020. The introduction of the Hessian damping in the continuous version of Nesterov's accelerated gradient method provides, by temporal discretization, fast proximal gradient algorithms where the oscillations are significantly attenuated. We will extend these results to the maximally monotone case. We rely on the technique introduced
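For a smooth function $f$, a representative form of the Hessian-damped inertial dynamic alluded to here is

$$\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \beta\, \nabla^2 f(x(t))\,\dot{x}(t) + \nabla f(x(t)) = 0,$$

where the term weighted by $\beta > 0$ is the Hessian damping that attenuates oscillations; this smooth form is shown only for orientation, the paper's setting being maximally monotone operators.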
-
Convergence Rate of $\mathcal{O}(1/k)$ for Optimistic Gradient and Extragradient Methods in Smooth Convex-Concave Saddle Point Problems SIAM J. Optim. (IF 2.247) Pub Date : 2020-12-03 Aryan Mokhtari; Asuman E. Ozdaglar; Sarath Pattathil
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3230-3251, January 2020. We study the iteration complexity of the optimistic gradient descent-ascent (OGDA) method and the extragradient (EG) method for finding a saddle point of a convex-concave unconstrained min-max problem. To do so, we first show that both OGDA and EG can be interpreted as approximate variants of the proximal point method.
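A minimal NumPy sketch of the extragradient iteration on an illustrative bilinear saddle-point problem $\min_x \max_y x^\top A y$ (problem instance and stepsize are our choices, not from the paper):

import numpy as np

def extragradient(A, x, y, eta=0.1, iters=1000):
    for _ in range(iters):
        # Extrapolation (midpoint) step using the current gradient field.
        x_half = x - eta * (A @ y)
        y_half = y + eta * (A.T @ x)
        # Update step using the gradients evaluated at the midpoint.
        x = x - eta * (A @ y_half)
        y = y + eta * (A.T @ x_half)
    return x, y

OGDA differs in reusing past gradients in place of the midpoint evaluation, which underlies the paper's interpretation of both methods as approximate proximal point iterations.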
-
Robust Spectral Risk Optimization When Information on Risk Spectrum Is Incomplete SIAM J. Optim. (IF 2.247) Pub Date : 2020-11-24 Wei Wang; Huifu Xu
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3198-3229, January 2020. A spectral risk measure (SRM) is a weighted average of value at risk where the weighting function (also known as risk spectrum or distortion function) characterizes a decision maker's risk attitude. In this paper, we consider the case where the decision maker's risk spectrum is ambiguous and introduce a robust SRM model
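The object in question can be written as

$$\mathrm{SRM}_\phi(X) = \int_0^1 \phi(p)\, \mathrm{VaR}_p(X)\, dp,$$

where the risk spectrum $\phi$ is nonnegative, nondecreasing, and integrates to one (the usual normalization; sign and ordering conventions vary across the literature).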
-
Globally Convergent Type-I Anderson Acceleration for Nonsmooth Fixed-Point Iterations SIAM J. Optim. (IF 2.247) Pub Date : 2020-11-18 Junzi Zhang; Brendan O'Donoghue; Stephen Boyd
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3170-3197, January 2020. We consider the application of the type-I Anderson acceleration to solving general nonsmooth fixed-point problems. By interleaving with safeguarding steps and employing a Powell-type regularization and a restart checking for strong linear independence of the updates, we propose the first globally convergent variant of Anderson
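For orientation only, a sketch of classical (unregularized) Anderson mixing for a fixed-point map $g$; note this is not the paper's method, which is the type-I variant with safeguarding steps, Powell-type regularization, and restart checking:

import numpy as np

def anderson(g, x0, m=5, iters=50):
    # Classical Anderson acceleration for x = g(x) with memory m.
    f0 = g(x0) - x0
    xs, fs = [x0, x0 + f0], [f0]
    for k in range(1, iters):
        x = xs[-1]
        f = g(x) - x
        fs.append(f)
        mk = min(m, k)
        # Differences of the most recent residuals and iterates.
        dF = np.stack([fs[-i] - fs[-i - 1] for i in range(1, mk + 1)], axis=1)
        dX = np.stack([xs[-i] - xs[-i - 1] for i in range(1, mk + 1)], axis=1)
        # Least-squares combination of past residuals, then mixed update.
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
        xs.append(x + f - (dX + dF) @ gamma)
    return xs[-1]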
-
Topology of Pareto Sets of Strongly Convex Problems SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-28 Naoki Hamada; Kenta Hayano; Shunsuke Ichiki; Yutaro Kabata; Hiroshi Teramoto
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2659-2686, January 2020. A multiobjective optimization problem is simplicial if the Pareto set and front are homeomorphic to a simplex and, under the homeomorphisms, each face of the simplex corresponds to the Pareto set and front of a subproblem that treats a subset of objective functions. In this paper, we show that strongly convex problems are
-
Feasible Corrector-Predictor Interior-Point Algorithm for $P_{*} (\kappa)$-Linear Complementarity Problems Based on a New Search Direction SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-28 Zsolt Darvay; Tibor Illés; Janez Povh; Petra Renáta Rigó
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2628-2658, January 2020. We introduce a new feasible corrector-predictor (CP) interior-point algorithm (IPA), which is suitable for solving linear complementarity problems (LCPs) with $P_{*} (\kappa)$-matrices. We use the method of algebraically equivalent transformation (AET) of the nonlinear equation of the system which defines the central path
-
A New Constraint Qualification and Sharp Optimality Conditions for Nonsmooth Mathematical Programming Problems in Terms of Quasidifferentials SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-24 M. V. Dolgopolik
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2603-2627, January 2020. The paper is devoted to an analysis of a new constraint qualification and a derivation of the strongest existing optimality conditions for nonsmooth mathematical programming problems with equality and inequality constraints in terms of Demyanov--Rubinov--Polyakova quasidifferentials under the minimal possible assumptions
-
Rank Optimality for the Burer--Monteiro Factorization SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-23 Irène Waldspurger; Alden Waters
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2577-2602, January 2020. When solving large-scale semidefinite programs that admit a low-rank solution, an efficient heuristic is the Burer--Monteiro factorization: instead of optimizing over the full matrix, one optimizes over its low-rank factors. This reduces the number of variables to optimize but destroys the convexity of the problem, thus
-
On the Behavior of the Douglas--Rachford Algorithm for Minimizing a Convex Function Subject to a Linear Constraint SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-23 Heinz H. Bauschke; Walaa M. Moursi
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2559-2576, January 2020. The Douglas--Rachford algorithm (DRA) is a powerful optimization method for minimizing the sum of two convex (not necessarily smooth) functions. The vast majority of previous research dealt with the case when the sum has at least one minimizer. In the absence of minimizers, it was recently shown that for the case of two
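A minimal sketch of the Douglas--Rachford iteration on an illustrative instance of this setting, minimizing $\|x\|_1$ subject to $Ax = b$; the splitting $f = \|\cdot\|_1$ and $g$ = indicator of the affine set is our choice for illustration:

import numpy as np

def prox_l1(z, gamma):
    # Proximal operator of gamma * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def project_affine(z, A, b):
    # Projection onto {x : A x = b}; A is assumed to have full row rank.
    return z - A.T @ np.linalg.solve(A @ A.T, A @ z - b)

def douglas_rachford(A, b, z, gamma=1.0, iters=500):
    for _ in range(iters):
        x = prox_l1(z, gamma)                   # prox of f
        y = project_affine(2 * x - z, A, b)     # prox of g at the reflection
        z = z + y - x                           # governing sequence update
    return prox_l1(z, gamma)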
-
Two-Stage Stochastic Programming with Linearly Bi-parameterized Quadratic Recourse SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-21 Junyi Liu; Ying Cui; Jong-Shi Pang; Suvrajeet Sen
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2530-2558, January 2020. This paper studies the class of two-stage stochastic programs with a linearly bi-parameterized recourse function defined by a convex quadratic program. A distinguishing feature of this new class of nonconvex stochastic programs is that the objective function in the second stage is linearly parameterized by the first-stage
-
A Trust Region Method for Finding Second-Order Stationarity in Linearly Constrained Nonconvex Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-16 Maher Nouiehed; Meisam Razaviyayn
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2501-2529, January 2020. Motivated by the TRACE algorithm [F. E. Curtis, D. P. Robinson, and M. Samadi, Math. Program., 162 (2017), pp. 1--32], we propose a trust region algorithm for finding second-order stationary points of a linearly constrained nonconvex optimization problem. We show the convergence of the proposed algorithm to ($\epsilon_g
-
Active Set Complexity of the Away-Step Frank--Wolfe Algorithm SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-16 Immanuel M. Bomze; Francesco Rinaldi; Damiano Zeffiro
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2470-2500, January 2020. In this paper, we study active set identification results for the away-step Frank--Wolfe algorithm in different settings. We first prove a local identification property that we apply, in combination with a convergence hypothesis, to get an active set identification result. We then prove, for nonconvex objectives, a novel
-
On the Computation of Kantorovich--Wasserstein Distances Between Two-Dimensional Histograms by Uncapacitated Minimum Cost Flows SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-09 Federico Bassetti; Stefano Gualandi; Marco Veneroni
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2441-2469, January 2020. In this work, we present a method to compute the Kantorovich--Wasserstein distance of order 1 between a pair of two-dimensional histograms. Recent works in computer vision and machine learning have shown the benefits of measuring Wasserstein distances of order 1 between histograms with $n$ bins by solving a classical transportation
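A toy version of the baseline the paper improves on: computing the order-1 distance between two tiny 2D histograms by solving the dense transportation LP directly (SciPy is our choice of solver here; the paper's contribution is an uncapacitated minimum cost flow formulation that scales far better):

import numpy as np
from scipy.optimize import linprog

mu = np.array([[0.5, 0.0], [0.25, 0.25]]).ravel()   # source histogram
nu = np.array([[0.0, 0.5], [0.25, 0.25]]).ravel()   # target histogram
coords = [(i, j) for i in range(2) for j in range(2)]
# Ground metric: L1 distance between bin centers.
C = np.array([[abs(a - c) + abs(b - d) for (c, d) in coords]
              for (a, b) in coords])
n = len(mu)
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0    # row sums of the plan equal mu
    A_eq[n + i, i::n] = 1.0             # column sums of the plan equal nu
res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([mu, nu]),
              bounds=(0, None))
print("W1 distance:", res.fun)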
-
An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-08 Xudong Li; Defeng Sun; Kim-Chuan Toh
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2410-2440, January 2020. Powerful commercial solvers based on interior-point methods (IPMs), such as Gurobi and Mosek, have been hugely successful in solving large-scale linear programming (LP) problems. The high efficiency of these solvers depends critically on the sparsity of the problem data and advanced matrix factorization techniques. For a large
-
Contracting Proximal Methods for Smooth Convex Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-11-10 Nikita Doikov; Yurii Nesterov
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3146-3169, January 2020. In this paper, we propose new accelerated methods for smooth convex optimization, called contracting proximal methods. At every step of these methods, we need to minimize a contracted version of the objective function augmented by a regularization term in the form of Bregman divergence. We provide global convergence analysis
-
Solving Multiobjective Mixed Integer Convex Optimization Problems SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-29 Marianna De Santis; Gabriele Eichfelder; Julia Niebling; Stefan Rocktäschel
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3122-3145, January 2020. Multiobjective mixed integer convex optimization refers to mathematical programming problems where more than one convex objective function needs to be optimized simultaneously and some of the variables are constrained to take integer values. We present a branch-and-bound method based on the use of properly defined lower
-
Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-28 Yuxin Chen; Yuejie Chi; Jianqing Fan; Cong Ma; Yuling Yan
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3098-3121, January 2020. This paper studies noisy low-rank matrix completion: given partial and noisy entries of a large low-rank matrix, the goal is to estimate the underlying matrix faithfully and efficiently. Arguably one of the most popular paradigms to tackle this problem is convex relaxation, which achieves remarkable efficacy in practice
-
Convergence of Inexact Forward--Backward Algorithms Using the Forward--Backward Envelope SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-22 S. Bonettini; M. Prato; S. Rebegoldi
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3069-3097, January 2020. This paper deals with a general framework for inexact forward--backward algorithms aimed at minimizing the sum of an analytic function and a lower semicontinuous, subanalytic, convex term. Such a framework relies on an implementable inexactness condition for the computation of the proximal operator and on a linesearch procedure
-
Second-Order Guarantees of Distributed Gradient Algorithms SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-22 Amir Daneshmand; Gesualdo Scutari; Vyacheslav Kungurtsev
SIAM Journal on Optimization, Volume 30, Issue 4, Page 3029-3068, January 2020. We consider distributed smooth nonconvex unconstrained optimization over networks, modeled as a connected graph. We examine the behavior of distributed gradient-based algorithms near strict saddle points. Specifically, we establish that (i) the renowned distributed gradient descent algorithm likely converges to a neighborhood
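For reference, the distributed gradient descent iteration referenced here takes the standard form (notation ours):

$$x_i^{k+1} = \sum_{j=1}^{N} W_{ij}\, x_j^{k} - \alpha\, \nabla f_i(x_i^{k}),$$

where $W$ is a mixing matrix compatible with the graph and $f_i$ is the local objective held by agent $i$.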
-
Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-19 Konstantin Usevich; Jianze Li; Pierre Comon
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2998-3028, January 2020. We propose a gradient-based Jacobi algorithm for a class of maximization problems on the unitary group, with a focus on approximate diagonalization of complex matrices and tensors by unitary transformations. We provide weak convergence results, and prove local linear convergence of this algorithm. The convergence results
-
The Convex Hull of a Quadratic Constraint over a Polytope SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-13 Asteroide Santana; Santanu S. Dey
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2983-2997, January 2020. A quadratically constrained quadratic program (QCQP) is an optimization problem in which the objective function is a quadratic function and the feasible region is defined by quadratic constraints. Solving nonconvex QCQP to global optimality is a well-known NP-hard problem and a traditional approach is to use convex relaxations
-
New Constraint Qualifications for Optimization Problems in Banach Spaces Based on Asymptotic KKT Conditions SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-08 Eike Börgens; Christian Kanzow; Patrick Mehlitz; Gerd Wachsmuth
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2956-2982, January 2020. Optimization theory in Banach spaces suffers from a lack of available constraint qualifications. There exist very few constraint qualifications, and these are often violated even in simple applications. This is very much in contrast to finite-dimensional nonlinear programs, where a large number of constraint qualifications
-
An Equivalence between Critical Points for Rank Constraints Versus Low-Rank Factorizations SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-08 Wooseok Ha; Haoyang Liu; Rina Foygel Barber
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2927-2955, January 2020. Two common approaches in low-rank optimization problems are either working directly with a rank constraint on the matrix variable or optimizing over a low-rank factorization so that the rank constraint is implicitly ensured. In this paper, we study the natural connection between the rank-constrained and factorized approaches
-
A Unified Adaptive Tensor Approximation Scheme to Accelerate Composite Convex Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-08 Bo Jiang; Tianyi Lin; Shuzhong Zhang
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2897-2926, January 2020. In this paper, we propose a unified two-phase scheme to accelerate any high-order regularized tensor approximation approach on the smooth part of a composite convex optimization model. The proposed scheme has the advantage of not needing to assume any prior knowledge of the Lipschitz constants for the gradient, the Hessian
-
Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-07 Quoc Tran-Dinh; Yuzixuan Zhu
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2866-2896, January 2020. In this paper, we propose two novel non-stationary first-order primal-dual algorithms to solve non-smooth composite convex optimization problems. Unlike existing primal-dual schemes where the parameters are often fixed, our methods use predefined and dynamic sequences for parameters. We prove that our first algorithm can
-
Distributionally Robust Stochastic Dual Dynamic Programming SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-07 Daniel Duque; David P. Morton
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2841-2865, January 2020. We consider a multistage stochastic linear program that lends itself to solution by stochastic dual dynamic programming (SDDP). In this context, we consider a distributionally robust variant of the model with a finite number of realizations at each stage. Distributional robustness is with respect to the probability mass
-
Convex Analysis in $\mathbb{Z}^n$ and Applications to Integer Linear Programming SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-07 Jun Li; Giandomenico Mastroeni
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2809-2840, January 2020. In this paper, we compare the definitions of convex sets and convex functions in finite dimensional integer spaces introduced by Adivar and Fang, Borwein, and Giladi, respectively. We show that their definitions of convex sets and convex functions are equivalent. We also provide exact formulations for convex sets, convex
-
Randomized Gradient Boosting Machine SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-07 Haihao Lu; Rahul Mazumder
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2780-2808, January 2020. The Gradient Boosting Machine (GBM) introduced by Friedman [J. H. Friedman, Ann. Statist., 29 (2001), pp. 1189--1232] is a powerful supervised learning algorithm that is very widely used in practice---it routinely features as a leading algorithm in machine learning competitions such as Kaggle and the KDDCup. In spite of
-
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-05 G. N. Grapiglia; Yu. Nesterov
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2750-2779, January 2020. In this paper, we study $p$-order methods for unconstrained minimization of convex functions that are $p$-times differentiable ($p\geq 2$) with $\nu$-Hölder continuous $p$th derivatives. We propose tensor schemes with and without acceleration. For the schemes without acceleration, we establish iteration complexity bounds
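The Hölder condition in question reads, in standard notation (ours),

$$\|D^p f(x) - D^p f(y)\| \le H_\nu\, \|x - y\|^{\nu} \quad \text{for all } x, y,$$

with $\nu \in (0, 1]$; the case $\nu = 1$ recovers Lipschitz continuity of the $p$th derivative.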
-
Stochastic Three Points Method for Unconstrained Smooth Minimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-01 El Houcine Bergou; Eduard Gorbunov; Peter Richtárik
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2726-2749, January 2020. In this paper we consider the unconstrained minimization problem of a smooth function in $\mathbb{R}^n$ in a setting where only function evaluations are possible. We design a novel randomized derivative-free algorithm---the stochastic three points (STP) method---and analyze its iteration complexity. At each iteration, STP
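A minimal sketch of the STP iteration as described: sample a random direction, evaluate the objective at three points, and keep the best (the fixed stepsize here is a simplification and may differ from the stepsize rules analyzed in the paper):

import numpy as np

def stp(f, x, alpha=0.1, iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        # Random search direction, normalized to unit length.
        s = rng.standard_normal(x.shape)
        s /= np.linalg.norm(s)
        # Keep the best of the three candidate points.
        x = min([x, x + alpha * s, x - alpha * s], key=f)
    return x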
-
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-10-01 Antonio Silveti-Falls; Cesare Molinari; Jalal Fadili
SIAM Journal on Optimization, Volume 30, Issue 4, Page 2687-2725, January 2020. In this paper we propose a splitting scheme which hybridizes the generalized conditional gradient with a proximal step and which we call the CGALP algorithm for minimizing the sum of three proper convex and lower-semicontinuous functions in real Hilbert spaces. The minimization is subject to an affine constraint, that,
-
Twice Epi-Differentiability of Extended-Real-Valued Functions with Applications in Composite Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-02 Ashkan Mohammadi; M. Ebrahim Sarabi
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2379-2409, January 2020. The paper is devoted to the study of the twice epi-differentiability of extended-real-valued functions, with an emphasis on functions satisfying a certain composite representation. This will be conducted under parabolic regularity, a second-order regularity condition that was recently utilized in [A. Mohammadi, B. Mordukhovich
-
Facets of the Stochastic Network Flow Problem SIAM J. Optim. (IF 2.247) Pub Date : 2020-09-01 Alexander S. Estes; Michael O. Ball
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2355-2378, January 2020. We study a type of network flow problem that we call the minimum-cost F-graph flow problem. This problem generalizes the typical minimum-cost network flow problem by allowing the underlying network to be a directed hypergraph rather than a directed graph. This new problem is pertinent because it can be used to model network
-
Multiscale Analysis of Accelerated Gradient Methods SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-25 Mohammad Farazmand
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2337-2354, January 2020. Accelerated gradient descent iterations are widely used in optimization. It is known that, in the continuous-time limit, these iterations converge to a second-order differential equation which we refer to as the accelerated gradient flow. Using geometric singular perturbation theory, we show that, under certain conditions
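In its best-known form (due to Su, Boyd, and Candès), the accelerated gradient flow referred to here is the ODE

$$\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,$$

obtained as the continuous-time limit of Nesterov's accelerated gradient iterations.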
-
A Privacy-Preserving Method to Optimize Distributed Resource Allocation SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-24 Olivier Beaude; Pascal Benchimol; Stéphane Gaubert; Paulin Jacquot; Nadia Oudjane
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2303-2336, January 2020. We consider a resource allocation problem involving a large number of agents with individual constraints subject to privacy, and a central operator whose objective is to optimize a global, possibly nonconvex, cost while satisfying the agents' constraints, for instance, an energy operator in charge of the management of energy
-
A Proximal Alternating Direction Method of Multiplier for Linearly Constrained Nonconvex Minimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-13 Jiawei Zhang; Zhi-Quan Luo
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2272-2302, January 2020. Consider the minimization of a nonconvex differentiable function over a bounded polyhedron. A popular primal-dual first-order method for this problem is to perform a gradient projection iteration for the augmented Lagrangian function and then update the dual multiplier vector using the constraint residual. However, numerical
-
Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-13 Ernest K. Ryu; Adrien B. Taylor; Carolina Bergeling; Pontus Giselsson
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2251-2271, January 2020. We propose a methodology for studying the performance of common splitting methods through semidefinite programming. We prove tightness of the methodology and demonstrate its value by presenting two applications of it. First, we use the methodology as a tool for computer-assisted proofs to prove tight analytical contraction
-
Solving Chance-Constrained Problems via a Smooth Sample-Based Nonlinear Approximation SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-12 Alejandra Peña-Ordieres; James R. Luedtke; Andreas Wächter
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2221-2250, January 2020. We introduce a new method for solving nonlinear continuous optimization problems with chance constraints. Our method is based on a reformulation of the probabilistic constraint as a quantile function. The quantile function is approximated via a differentiable sample average approximation. We provide theoretical statistical
-
A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-12 Yangjing Zhang; Ning Zhang; Defeng Sun; Kim-Chuan Toh
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2197-2220, January 2020. Undirected graphical models have been especially popular for learning the conditional independence structure among a large number of variables where the observations are drawn independently and identically from the same distribution. However, many modern statistical problems would involve categorical data or time-varying
-
A Nonsmooth Trust-Region Method for Locally Lipschitz Functions with Application to Optimization Problems Constrained by Variational Inequalities SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-12 Constantin Christof; Juan Carlos De los Reyes; Christian Meyer
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2163-2196, January 2020. We propose a general trust-region method for the minimization of nonsmooth and nonconvex, locally Lipschitz continuous functions that can be applied, e.g., to optimization problems constrained by elliptic variational inequalities. The convergence of the considered algorithm to C-stationary points is verified in an abstract
-
Finite Convergence of Proximal-Gradient Inertial Algorithms Combining Dry Friction with Hessian-Driven Damping SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-11 Samir Adly; Hedy Attouch
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2134-2162, January 2020. In a Hilbert space ${\mathcal H}$, we introduce a new class of proximal-gradient algorithms with finite convergence properties. These algorithms naturally occur as discrete temporal versions of an inertial differential inclusion which is stabilized under the joint action of three dampings: dry friction, viscous friction
-
Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-11 Yifan Hu; Xin Chen; Niao He
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2103-2133, January 2020. In this paper, we study a class of stochastic optimization problems, referred to as the conditional stochastic optimization (CSO), in the form of $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\, f_\xi\big(\mathbb{E}_{\eta|\xi}[g_\eta(x, \xi)]\big)$, which finds a wide spectrum of applications including portfolio selection, reinforcement
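The natural nested sample average approximation for this problem replaces both expectations by empirical means (indices illustrative):

$$\min_{x \in \mathcal{X}} \; \frac{1}{n}\sum_{i=1}^{n} f_{\xi_i}\!\Big(\frac{1}{m}\sum_{j=1}^{m} g_{\eta_{ij}}(x, \xi_i)\Big),$$

with $n$ outer samples $\xi_i$ and, for each, $m$ inner samples $\eta_{ij}$ drawn from the conditional distribution of $\eta$ given $\xi_i$.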
-
Periodical Multistage Stochastic Programs SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-05 Alexander Shapiro; Lingquan Ding
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2083-2102, January 2020. In some applications the considered multistage stochastic programs have a periodic behavior. We show that in such cases it is possible to drastically reduce the number of stages by introducing a periodic analogue of the so-called Bellman equations for discounted infinite horizon problems used in Markov decision processes
-
Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation SIAM J. Optim. (IF 2.247) Pub Date : 2020-08-03 Etienne De Klerk; François Glineur; Adrien B. Taylor
SIAM Journal on Optimization, Volume 30, Issue 3, Page 2053-2082, January 2020. We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and Newton's method for self-concordant functions, including the case of inexact search directions. The analysis uses semidefinite programming performance estimation, as pioneered
Contents have been reproduced by permission of the publishers.