-
On Integrality in Semidefinite Programming for Discrete Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-03-15 Frank de Meijer, Renata Sotirov
SIAM Journal on Optimization, Volume 34, Issue 1, Page 1071-1096, March 2024. Abstract. It is well known that by adding integrality constraints to the semidefinite programming (SDP) relaxation of the max-cut problem, the resulting integer semidefinite program is an exact formulation of the problem. In this paper we show similar results for a wide variety of discrete optimization problems for which
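The classical exactness fact this abstract builds on: adding the integrality (rank-one, ±1) constraint X = xx^T to the max-cut SDP relaxation makes the formulation exact. A minimal brute-force sketch of that exact formulation on a toy graph (the triangle graph and weights are illustrative, not from the paper):

```python
import itertools
import numpy as np

# Weighted adjacency of a triangle graph (all weights 1); its max cut is 2.
W = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
n = W.shape[0]

def cut_value(x):
    # Max-cut objective (1/4) * sum_ij W_ij (1 - x_i x_j), written via X = x x^T.
    X = np.outer(x, x)
    return 0.25 * np.sum(W * (1.0 - X))

# Enumerate all rank-one matrices X = x x^T with x in {-1, +1}^n: these are
# exactly the feasible points once integrality constraints are added to the
# SDP relaxation (diag(X) = 1, X psd, rank one with +/-1 entries).
best = max(cut_value(np.array(x)) for x in itertools.product([-1, 1], repeat=n))
print(best)  # 2.0 for the triangle
```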
-
Randomized Douglas–Rachford Methods for Linear Systems: Improved Accuracy and Efficiency SIAM J. Optim. (IF 3.1) Pub Date : 2024-03-15 Deren Han, Yansheng Su, Jiaxin Xie
SIAM Journal on Optimization, Volume 34, Issue 1, Page 1045-1070, March 2024. Abstract. The Douglas–Rachford (DR) method is a widely used method for finding a point in the intersection of two closed convex sets (feasibility problem). However, the method converges weakly, and the associated rate of convergence is hard to analyze in general. In addition, the direct extension of the DR method for solving
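For context, the classical (deterministic) DR iteration for a two-set feasibility problem, which the paper's randomized variants build on. A minimal sketch with two intervals as the convex sets (the sets and starting point are illustrative):

```python
import numpy as np

def proj_box(z, lo, hi):
    # Euclidean projection onto the interval [lo, hi].
    return np.clip(z, lo, hi)

def douglas_rachford(z0, proj_A, proj_B, iters=200):
    # Classical DR for the feasibility problem: find a point in A ∩ B.
    # z_{k+1} = z_k + P_B(2 P_A(z_k) - z_k) - P_A(z_k); the "shadow" P_A(z_k)
    # converges (weakly, in general) to a point of the intersection.
    z = z0
    for _ in range(iters):
        a = proj_A(z)
        z = z + proj_B(2 * a - z) - a
    return proj_A(z)

# A = [1, 3], B = [2, 5]; their intersection is [2, 3].
x = douglas_rachford(np.array([10.0]),
                     lambda z: proj_box(z, 1.0, 3.0),
                     lambda z: proj_box(z, 2.0, 5.0))
print(x)  # a point in [2, 3]
```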
-
Decentralized Gradient Descent Maximization Method for Composite Nonconvex Strongly-Concave Minimax Problems SIAM J. Optim. (IF 3.1) Pub Date : 2024-03-12 Yangyang Xu
SIAM Journal on Optimization, Volume 34, Issue 1, Page 1006-1044, March 2024. Abstract. Minimax problems have recently attracted a lot of research interest. A few efforts have been made to solve decentralized nonconvex strongly-concave (NCSC) minimax-structured optimization; however, all of them focus on smooth problems with at most a constraint on the maximization variable. In this paper, we make
-
A Two-Time-Scale Stochastic Optimization Framework with Applications in Control and Reinforcement Learning SIAM J. Optim. (IF 3.1) Pub Date : 2024-03-08 Sihan Zeng, Thinh T. Doan, Justin Romberg
SIAM Journal on Optimization, Volume 34, Issue 1, Page 946-976, March 2024. Abstract. We study a new two-time-scale stochastic gradient method for solving optimization problems, where the gradients are computed with the aid of an auxiliary variable under samples generated by time-varying Markov random processes controlled by the underlying optimization variable. These time-varying samples make gradient
-
How Do Exponential Size Solutions Arise in Semidefinite Programming? SIAM J. Optim. (IF 3.1) Pub Date : 2024-03-08 Gábor Pataki, Aleksandr Touzov
SIAM Journal on Optimization, Volume 34, Issue 1, Page 977-1005, March 2024. Abstract. A striking pathology of semidefinite programs (SDPs) is illustrated by a classical example of Khachiyan: feasible solutions in SDPs may need exponential space even to write down. Such exponential size solutions are the main obstacle to solving a long-standing, fundamental open problem: can we decide feasibility of
-
A Chain Rule for Strict Twice Epi-Differentiability and Its Applications SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-29 Nguyen T. V. Hang, M. Ebrahim Sarabi
SIAM Journal on Optimization, Volume 34, Issue 1, Page 918-945, March 2024. Abstract. The presence of second-order smoothness for objective functions of optimization problems can provide valuable information about their stability properties and help us design efficient numerical algorithms for solving these problems. Such second-order information, however, cannot be expected in various constrained
-
Approximating Higher-Order Derivative Tensors Using Secant Updates SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-28 Karl Welzel, Raphael A. Hauser
SIAM Journal on Optimization, Volume 34, Issue 1, Page 893-917, March 2024. Abstract. Quasi-Newton methods employ an update rule that gradually improves the Hessian approximation using the already available gradient evaluations. We propose higher-order secant updates which generalize this idea to higher-order derivatives, approximating, for example, third derivatives (which are tensors) from given
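The first-order instance of the idea being generalized here is the classical rank-one secant update. A minimal sketch of Broyden's update for a Jacobian approximation (the toy function and starting matrix are illustrative, not the paper's higher-order scheme):

```python
import numpy as np

def broyden_update(B, s, y):
    # Classical ("good") Broyden rank-one secant update: the new approximation
    # B+ satisfies the secant equation B+ s = y while changing B minimally
    # in Frobenius norm.
    return B + np.outer(y - B @ s, s) / (s @ s)

# Toy check on F(x) = (x0^2, x1^2), whose exact Jacobian is diag(2 x).
F = lambda x: x ** 2
x0, x1 = np.array([1.0, 2.0]), np.array([1.1, 2.2])
s, y = x1 - x0, F(x1) - F(x0)
B = np.eye(2)                 # crude initial Jacobian guess
B = broyden_update(B, s, y)
print(np.allclose(B @ s, y))  # True: the secant equation holds by construction
```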
-
Continuous Selections of Solutions to Parametric Variational Inequalities SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-28 Shaoning Han, Jong-Shi Pang
SIAM Journal on Optimization, Volume 34, Issue 1, Page 870-892, March 2024. Abstract. This paper studies the existence of a (Lipschitz) continuous (single-valued) solution function of parametric variational inequalities under functional and constraint perturbations. At the most elementary level, this issue can be explained from classical parametric linear programming and its resolution by the parametric
-
Sample Size Estimates for Risk-Neutral Semilinear PDE-Constrained Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-23 Johannes Milz, Michael Ulbrich
SIAM Journal on Optimization, Volume 34, Issue 1, Page 844-869, March 2024. Abstract. The sample average approximation (SAA) approach is applied to risk-neutral optimization problems governed by semilinear elliptic partial differential equations with random inputs. After constructing a compact set that contains the SAA critical points, we derive nonasymptotic sample size estimates for SAA critical
-
Subset Selection and the Cone of Factor-Width-k Matrices SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-22 Walid Ben-Ameur
SIAM Journal on Optimization, Volume 34, Issue 1, Page 817-843, March 2024. Abstract. We study the cone of factor-width-k matrices, where the factor width of a positive semidefinite matrix is defined as the smallest number k allowing it to be expressed as a sum of positive semidefinite matrices that are nonzero only on a single k×k principal submatrix. Two hierarchies of approximations
-
A Path-Based Approach to Constrained Sparse Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-21 Nadav Hallak
SIAM Journal on Optimization, Volume 34, Issue 1, Page 790-816, March 2024. Abstract. This paper proposes a path-based approach for the minimization of a continuously differentiable function over sparse symmetric sets, which is a hard problem that exhibits a restrictiveness-hierarchy of necessary optimality conditions. To achieve the more restrictive conditions in the hierarchy, state-of-the-art algorithms
-
Accelerating Primal-Dual Methods for Regularized Markov Decision Processes SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-20 Haoya Li, Hsiang-Fu Yu, Lexing Ying, Inderjit S. Dhillon
SIAM Journal on Optimization, Volume 34, Issue 1, Page 764-789, March 2024. Abstract. Entropy regularized Markov decision processes have been widely used in reinforcement learning. This paper is concerned with the primal-dual formulation of the entropy regularized problems. Standard first-order methods suffer from slow convergence due to the lack of strict convexity and concavity. To address this issue
-
Safe and Verified Gomory Mixed-Integer Cuts in a Rational Mixed-Integer Program Framework SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-16 Leon Eifler, Ambros Gleixner
SIAM Journal on Optimization, Volume 34, Issue 1, Page 742-763, March 2024. Abstract. This paper is concerned with the exact solution of mixed-integer programs (MIPs) over the rational numbers, i.e., without any roundoff errors and error tolerances. Here, one computational bottleneck that should be avoided whenever possible is to employ large-scale symbolic computations. Instead it is often possible
-
Linear Programming on the Stiefel Manifold SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-15 Mengmeng Song, Yong Xia
SIAM Journal on Optimization, Volume 34, Issue 1, Page 718-741, March 2024. Abstract. Linear programming on the Stiefel manifold (LPS) is studied for the first time. It aims at minimizing a linear objective function over the set of all [math]-tuples of orthonormal vectors in [math] satisfying [math] additional linear constraints. Despite the classical polynomial-time solvable case [math], general (LPS)
-
Bounds for Multistage Mixed-Integer Distributionally Robust Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-13 Güzin Bayraksan, Francesca Maggioni, Daniel Faccini, Ming Yang
SIAM Journal on Optimization, Volume 34, Issue 1, Page 682-717, March 2024. Abstract. Multistage mixed-integer distributionally robust optimization (DRO) forms a class of extremely challenging problems since their size grows exponentially with the number of stages. One way to model the uncertainty in multistage DRO is by creating sets of conditional distributions (the so-called conditional ambiguity
-
A Riemannian Proximal Newton Method SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-09 Wutao Si, P.-A. Absil, Wen Huang, Rujun Jiang, Simon Vary
SIAM Journal on Optimization, Volume 34, Issue 1, Page 654-681, March 2024. Abstract. In recent years, the proximal gradient method and its variants have been generalized to Riemannian manifolds for solving optimization problems with an additively separable structure, i.e., [math], where [math] is continuously differentiable, and [math] may be nonsmooth but convex with computationally reasonable proximal
-
Various Notions of Nonexpansiveness Coincide for Proximal Mappings of Functions SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-09 Honglin Luo, Xianfu Wang, Xinmin Yang
SIAM Journal on Optimization, Volume 34, Issue 1, Page 642-653, March 2024. Abstract. Proximal mappings are essential in splitting algorithms for both convex and nonconvex optimization. In this paper, we show that proximal mappings of every prox-bounded function are nonexpansive if and only if they are firmly nonexpansive if and only if they are averaged if and only if the function is convex. Lipschitz
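The convex direction of the equivalence is easy to observe numerically: for a convex function such as |x|, the proximal mapping (soft-thresholding) satisfies the firm nonexpansiveness inequality. A small sanity-check sketch (illustrative only, not the paper's argument):

```python
import numpy as np

def prox_abs(x, t=1.0):
    # Proximal mapping of t*|x| applied componentwise: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# |x| is convex, so its prox should be firmly nonexpansive:
#   ||P(x) - P(y)||^2 <= <P(x) - P(y), x - y>   for all x, y.
rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    d = prox_abs(x) - prox_abs(y)
    ok &= (d @ d) <= (d @ (x - y)) + 1e-12
print(ok)  # True
```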
-
Second Order Conditions to Decompose Smooth Functions as Sums of Squares SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-08 Ulysse Marteau-Ferey, Francis Bach, Alessandro Rudi
SIAM Journal on Optimization, Volume 34, Issue 1, Page 616-641, March 2024. Abstract. We consider the problem of decomposing a regular nonnegative function as a sum of squares of functions which preserve some form of regularity. In the same way as decomposing nonnegative polynomials as sum of squares of polynomials allows one to derive methods in order to solve global optimization problems on polynomials
-
Harmonic Hierarchies for Polynomial Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-06 Sergio Cristancho, Mauricio Velasco
SIAM Journal on Optimization, Volume 34, Issue 1, Page 590-615, March 2024. Abstract. We introduce novel polyhedral approximation hierarchies for the cone of nonnegative forms on the unit sphere in [math] and for its (dual) cone of moments. We prove computable quantitative bounds on the speed of convergence of such hierarchies. We also introduce a novel optimization-free algorithm for building converging
-
Convergence Rate Analysis of a Dykstra-Type Projection Algorithm SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-06 Xiaozhou Wang, Ting Kei Pong
SIAM Journal on Optimization, Volume 34, Issue 1, Page 563-589, March 2024. Abstract. Given closed convex sets [math], [math], and some nonzero linear maps [math], [math], of suitable dimensions, the multiset split feasibility problem aims at finding a point in [math] based on computing projections onto [math] and multiplications by [math] and [math]. In this paper, we consider the associated best
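As background, the classical two-set Dykstra projection algorithm whose rate this paper's variant generalizes. A minimal sketch (the ball and halfspace sets are illustrative); unlike plain alternating projections, Dykstra's correction terms recover the actual projection onto the intersection:

```python
import numpy as np

def proj_ball(x):
    # Projection onto the unit Euclidean ball.
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def proj_halfspace(x):
    # Projection onto {x : x[0] >= 0.6}.
    y = x.copy()
    y[0] = max(y[0], 0.6)
    return y

def dykstra(x0, PA, PB, iters=1000):
    # Dykstra's algorithm: the correction terms p, q make the iterates converge
    # to the projection of x0 onto A ∩ B, not just some point of A ∩ B.
    z, p, q = x0.copy(), np.zeros_like(x0), np.zeros_like(x0)
    for _ in range(iters):
        y = PA(z + p)
        p = z + p - y
        z = PB(y + q)
        q = y + q - z
    return z

x = dykstra(np.array([0.0, 1.0]), proj_ball, proj_halfspace)
print(np.round(x, 3))  # approx [0.6, 0.8], the true projection of (0, 1)
```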
-
Exact Quantization of Multistage Stochastic Linear Problems SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-05 Maël Forcier, Stéphane Gaubert, Vincent Leclère
SIAM Journal on Optimization, Volume 34, Issue 1, Page 533-562, March 2024. Abstract. We show that the multistage stochastic linear problem (MSLP) with an arbitrary cost distribution is equivalent to an MSLP on a finite scenario tree. We establish this exact quantization result by analyzing the polyhedral structure of MSLPs. In particular, we show that the expected cost-to-go functions are polyhedral
-
Shortest Paths in Graphs of Convex Sets SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-01 Tobia Marcucci, Jack Umenberger, Pablo Parrilo, Russ Tedrake
SIAM Journal on Optimization, Volume 34, Issue 1, Page 507-532, March 2024. Abstract. Given a graph, the shortest-path problem requires finding a sequence of edges with minimum cumulative length that connects a source vertex to a target vertex. We consider a variant of this classical problem in which the position of each vertex in the graph is a continuous decision variable constrained in a convex
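The discrete baseline being generalized is the classical shortest-path problem, solvable by Dijkstra's algorithm when edge lengths are nonnegative. A minimal sketch (the graph is illustrative; the paper's vertices are convex sets rather than fixed points):

```python
import heapq

def dijkstra(adj, source, target):
    # Classical shortest path on a graph with nonnegative edge lengths.
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

adj = {"s": [("a", 1.0), ("b", 4.0)],
       "a": [("b", 1.0), ("t", 5.0)],
       "b": [("t", 1.0)]}
print(dijkstra(adj, "s", "t"))  # 3.0, along s -> a -> b -> t
```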
-
Hybrid Algorithms for Finding a D-Stationary Point of a Class of Structured Nonsmooth DC Minimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-02-01 Zhe Sun, Lei Wu
SIAM Journal on Optimization, Volume 34, Issue 1, Page 485-506, March 2024. Abstract. In this paper, we consider a class of structured nonsmooth difference-of-convex (DC) minimization in which the first convex component is the sum of a smooth and a nonsmooth function, while the second convex component is the supremum of finitely many convex smooth functions. The existing methods for this problem usually
-
Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-31 David Applegate, Mateo Díaz, Haihao Lu, Miles Lubin
SIAM Journal on Optimization, Volume 34, Issue 1, Page 459-484, March 2024. Abstract. We study the problem of detecting infeasibility of large-scale linear programming problems using the primal-dual hybrid gradient (PDHG) method of Chambolle and Pock [J. Math. Imaging Vision, 40 (2011), pp. 120–145]. The literature on PDHG has focused chiefly on problems with at least one optimal solution. We show
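The PDHG iteration of Chambolle and Pock, which the paper analyzes on infeasible instances, is simple to state for a standard-form LP. A minimal sketch on a tiny feasible LP (the data, step sizes, and iteration count are illustrative):

```python
import numpy as np

# PDHG applied to the LP  min c'x  s.t.  Ax = b, x >= 0,
# via the saddle problem  min_{x>=0} max_y  c'x + y'(Ax - b).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])      # optimum: x = (1, 0), value 1

tau = sigma = 0.5 / np.linalg.norm(A, 2)   # ensures tau*sigma*||A||^2 < 1
x, y = np.zeros(2), np.zeros(1)
for _ in range(5000):
    x_new = np.maximum(x - tau * (c + A.T @ y), 0.0)   # primal: projected step
    y = y + sigma * (A @ (2 * x_new - x) - b)          # dual: with extrapolation
    x = x_new
print(np.round(x, 3), np.round(c @ x, 3))  # approx [1. 0.] with value 1.0
```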
-
Distributionally Favorable Optimization: A Framework for Data-Driven Decision-Making with Endogenous Outliers SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-29 Nan Jiang, Weijun Xie
SIAM Journal on Optimization, Volume 34, Issue 1, Page 419-458, March 2024. Abstract. A typical data-driven stochastic program seeks the best decision that minimizes the sum of a deterministic cost function and an expected recourse function under a given distribution. Recently, much success has been witnessed in the development of distributionally robust optimization (DRO), which considers the worst-case
-
Bayesian Stochastic Gradient Descent for Stochastic Optimization with Streaming Input Data SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-25 Tianyi Liu, Yifan Lin, Enlu Zhou
SIAM Journal on Optimization, Volume 34, Issue 1, Page 389-418, March 2024. Abstract. We consider stochastic optimization under distributional uncertainty, where the unknown distributional parameter is estimated from streaming data that arrive sequentially over time. Moreover, data may depend on the decision at the time when they are generated. For both decision-independent and decision-dependent uncertainties
-
Basic Convex Analysis in Metric Spaces with Bounded Curvature SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-19 Adrian S. Lewis, Genaro López-Acedo, Adriana Nicolae
SIAM Journal on Optimization, Volume 34, Issue 1, Page 366-388, March 2024. Abstract. Differentiable structure ensures that many of the basics of classical convex analysis extend naturally from Euclidean space to Riemannian manifolds. Without such structure, however, extensions are more challenging. Nonetheless, in Alexandrov spaces with curvature bounded above (but possibly positive), we develop several
-
Descent Properties of an Anderson Accelerated Gradient Method with Restarting SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-19 Wenqing Ouyang, Yang Liu, Andre Milzarek
SIAM Journal on Optimization, Volume 34, Issue 1, Page 336-365, March 2024. Abstract. Anderson acceleration (AA) is a popular acceleration technique to enhance the convergence of fixed-point schemes. The analysis of AA approaches often focuses on the convergence behavior of a corresponding fixed-point residual, while the behavior of the underlying objective function values along the accelerated
-
A Decomposition Algorithm for Two-Stage Stochastic Programs with Nonconvex Recourse Functions SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-19 Hanyang Li, Ying Cui
SIAM Journal on Optimization, Volume 34, Issue 1, Page 306-335, March 2024. Abstract. In this paper, we study a decomposition method for solving a class of nonconvex two-stage stochastic programs, where both the objective and constraints of the second-stage problem are nonlinearly parameterized by the first-stage variables. Due to the failure of the Clarke regularity of the resulting nonconvex
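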
-
Sharper Bounds for Proximal Gradient Algorithms with Errors SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-19 Anis Hamadouche, Yun Wu, Andrew M. Wallace, João F. C. Mota
SIAM Journal on Optimization, Volume 34, Issue 1, Page 278-305, March 2024. Abstract. We analyze the convergence of the proximal gradient algorithm for convex composite problems in the presence of gradient and proximal computational inaccuracies. We generalize the deterministic analysis to the quasi-Fejér case and quantify the uncertainty incurred from approximate computing and early termination errors
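The algorithm under analysis is the proximal gradient method; a minimal sketch on a lasso problem, with a small deterministic perturbation injected into the gradient to mimic the "with errors" setting (the data, error model, and tolerances are illustrative):

```python
import numpy as np

def soft(x, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))
x_true = np.array([1.0, 0.0, 0.0, -2.0, 0.0])
b = A @ x_true
lam = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient

# Proximal gradient (ISTA) for  min 0.5*||Ax - b||^2 + lam*||x||_1,
# run with a bounded gradient error at every iteration.
x = np.zeros(5)
for k in range(2000):
    grad = A.T @ (A @ x - b)
    err = 1e-6 * np.ones(5)            # small, bounded gradient inaccuracy
    x = soft(x - (grad + err) / L, lam / L)
print(np.round(x, 2))                   # close to the sparse x_true
```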
-
Continuous Newton-like Methods Featuring Inertia and Variable Mass SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-18 Camille Castera, Hedy Attouch, Jalal Fadili, Peter Ochs
SIAM Journal on Optimization, Volume 34, Issue 1, Page 251-277, March 2024. Abstract. We introduce a new dynamical system at the interface between second-order dynamics with inertia and Newton’s method. This system extends the class of inertial Newton-like dynamics by featuring a time-dependent parameter in front of the acceleration, called variable mass. For strongly convex optimization, we provide
-
Nonlinear Cone Separation Theorems in Real Topological Linear Spaces SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-16 Christian Günther, Bahareh Khazayel, Christiane Tammer
SIAM Journal on Optimization, Volume 34, Issue 1, Page 225-250, March 2024. Abstract. The separation of two sets (or, more specifically, of two cones) plays an important role in different fields of mathematics such as variational analysis, convex analysis, convex geometry, and optimization. In the paper, we derive some new results for the separation of two not necessarily convex cones by a (convex) cone/conical
-
Global Complexity Bound of a Proximal ADMM for Linearly Constrained Nonseparable Nonconvex Composite Programming SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-11 Weiwei Kong, Renato D. C. Monteiro
SIAM Journal on Optimization, Volume 34, Issue 1, Page 201-224, March 2024. Abstract. This paper proposes and analyzes a dampened proximal alternating direction method of multipliers (DP.ADMM) for solving linearly constrained nonconvex optimization problems where the smooth part of the objective function is nonseparable. Each iteration of DP.ADMM consists of (i) a sequence of partial proximal augmented
-
New Bounds for the Integer Carathéodory Rank SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-09 Iskander Aliev, Martin Henk, Mark Hogan, Stefan Kuhlmann, Timm Oertel
SIAM Journal on Optimization, Volume 34, Issue 1, Page 190-200, March 2024. Abstract. Given a rational pointed [math]-dimensional cone [math], we study the integer Carathéodory rank [math] and its asymptotic form [math], where we consider “most” integer vectors in the cone. The main result significantly improves the previously known upper bound for [math]. We also study bounds on [math] in terms of
-
Optimal Algorithms for Stochastic Complementary Composite Minimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-05 Alexandre d’Aspremont, Cristóbal Guzmán, Clément Lezane
SIAM Journal on Optimization, Volume 34, Issue 1, Page 163-189, March 2024. Abstract. Inspired by regularization techniques in statistics and machine learning, we study complementary composite minimization in the stochastic setting. This problem corresponds to the minimization of the sum of a (weakly) smooth function endowed with a stochastic first-order oracle and a structured uniformly convex (possibly
-
A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-05 Zheng Qu, Xindong Tang
SIAM Journal on Optimization, Volume 34, Issue 1, Page 127-162, March 2024. Abstract. In this paper, we consider polynomial optimization with correlative sparsity. We construct correlatively sparse Lagrange multiplier expressions (CS-LMEs) and propose CS-LME reformulations for polynomial optimization problems using the Karush–Kuhn–Tucker optimality conditions. Correlatively sparse sum-of-squares (CS-SOS)
-
Aggregations of Quadratic Inequalities and Hidden Hyperplane Convexity SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-05 Grigoriy Blekherman, Santanu S. Dey, Shengding Sun
SIAM Journal on Optimization, Volume 34, Issue 1, Page 98-126, March 2024. Abstract. We study properties of the convex hull of a set [math] described by quadratic inequalities. A simple way of generating inequalities valid on [math] is to take nonnegative linear combinations of the defining inequalities of [math]. We call such inequalities aggregations. Special aggregations naturally contain the convex
-
Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-04 Jérôme Bolte, Edouard Pauwels, Antonio Silveti-Falls
SIAM Journal on Optimization, Volume 34, Issue 1, Page 71-97, March 2024. Abstract. We leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a monotone inclusion problem will be path differentiable, with formulas for computing its generalized gradient. A direct consequence of our result is that
-
Sufficient Conditions for Instability of the Subgradient Method with Constant Step Size SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-04 Cédric Josz, Lexiao Lai
SIAM Journal on Optimization, Volume 34, Issue 1, Page 57-70, March 2024. Abstract. We provide sufficient conditions for instability of the subgradient method with constant step size around a local minimum of a locally Lipschitz semialgebraic function. They are satisfied by several spurious local minima arising in robust principal component analysis and neural networks.
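The basic phenomenon behind such instability is easy to see on the simplest nonsmooth example: with a constant step size, the subgradient method on f(x) = |x| oscillates at the scale of the step instead of converging. A minimal sketch (illustrative only; the paper's conditions concern semialgebraic functions and spurious local minima):

```python
# Subgradient method with constant step size on f(x) = |x|: around the
# minimizer 0 the iterates end up alternating between +step/2 and -step/2.
step = 0.1
x = 0.35
traj = []
for _ in range(20):
    g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)   # a subgradient of |x|
    x = x - step * g
    traj.append(round(x, 10))
print(traj[-4:])  # eventually alternates between 0.05 and -0.05
```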
-
Super-Universal Regularized Newton Method SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-03 Nikita Doikov, Konstantin Mishchenko, Yurii Nesterov
SIAM Journal on Optimization, Volume 34, Issue 1, Page 27-56, March 2024. Abstract. We analyze the performance of a variant of the Newton method with quadratic regularization for solving composite convex minimization problems. At each step of our method, we choose a regularization parameter proportional to a certain power of the gradient norm at the current point. We introduce a family of problem classes
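A one-dimensional sketch of the key idea: regularize the Newton step with a parameter proportional to a power of the gradient norm, so the regularization vanishes as the method approaches a solution. The test function and the constants H, alpha are illustrative, not from the paper:

```python
# Gradient-regularized Newton step in 1-D: lam_k = H * |f'(x_k)|^alpha.
f = lambda x: x ** 4 + x ** 2          # convex test function, minimizer x* = 0
df = lambda x: 4 * x ** 3 + 2 * x
d2f = lambda x: 12 * x ** 2 + 2.0

H, alpha = 1.0, 0.5
x = 2.0
for _ in range(50):
    g = df(x)
    lam = H * abs(g) ** alpha          # regularization from the gradient norm
    x = x - g / (d2f(x) + lam)
print(abs(x) < 1e-8)  # True: the iterates converge to the minimizer 0
```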
-
Time-Varying Semidefinite Programming: Path Following a Burer–Monteiro Factorization SIAM J. Optim. (IF 3.1) Pub Date : 2024-01-03 Antonio Bellon, Mareike Dressler, Vyacheslav Kungurtsev, Jakub Mareček, André Uschmajew
SIAM Journal on Optimization, Volume 34, Issue 1, Page 1-26, March 2024. Abstract. We present an online algorithm for time-varying semidefinite programs (TV-SDPs), based on the tracking of the solution trajectory of a low-rank matrix factorization, also known as the Burer–Monteiro factorization, in a path-following procedure. There, a predictor-corrector algorithm solves a sequence of linearized systems
-
Convex Bi-level Optimization Problems with Nonsmooth Outer Objective Function SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-10 Roey Merchav, Shoham Sabach
SIAM Journal on Optimization, Volume 33, Issue 4, Page 3114-3142, December 2023. Abstract. In this paper, we propose the Bi-Sub-Gradient (Bi-SG) method, which is a generalization of the classical sub-gradient method to the setting of convex bi-level optimization problems. This is a first-order method that is very easy to implement in the sense that it requires only a computation of the associated proximal
-
Large-Scale Nonconvex Optimization: Randomization, Gap Estimation, and Numerical Resolution SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-10 J. Frédéric Bonnans, Kang Liu, Nadia Oudjane, Laurent Pfeiffer, Cheng Wan
SIAM Journal on Optimization, Volume 33, Issue 4, Page 3083-3113, December 2023. Abstract. We address a large-scale and nonconvex optimization problem, involving an aggregative term. This term can be interpreted as the sum of the contributions of [math] agents to some common good, with [math] large. We investigate a relaxation of this problem, obtained by randomization. The relaxation gap is proved
-
Direct Search Based on Probabilistic Descent in Reduced Spaces SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-10 Lindon Roberts, Clément W. Royer
SIAM Journal on Optimization, Volume 33, Issue 4, Page 3057-3082, December 2023. Abstract. Derivative-free algorithms seek the minimum value of a given objective function without using any derivative information. The performance of these methods often worsens as the dimension increases, a phenomenon predicted by their worst-case complexity guarantees. Nevertheless, recent algorithmic proposals have
-
Convergence Analysis of the Proximal Gradient Method in the Presence of the Kurdyka–Łojasiewicz Property Without Global Lipschitz Assumptions SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-09 Xiaoxi Jia, Christian Kanzow, Patrick Mehlitz
SIAM Journal on Optimization, Volume 33, Issue 4, Page 3038-3056, December 2023. Abstract. We consider a composite optimization problem where the sum of a continuously differentiable and a merely lower semicontinuous function has to be minimized. The proximal gradient algorithm is the classical method for solving such a problem numerically. The corresponding global convergence and local rate-of-convergence
-
Convergence of the Momentum Method for Semialgebraic Functions with Locally Lipschitz Gradients SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-06 Cédric Josz, Lexiao Lai, Xiaopeng Li
SIAM Journal on Optimization, Volume 33, Issue 4, Page 3012-3037, December 2023. Abstract. We propose a new length formula that governs the iterates of the momentum method when minimizing differentiable semialgebraic functions with locally Lipschitz gradients. It enables us to establish local convergence, global convergence, and convergence to local minimizers without assuming global Lipschitz continuity
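The iteration in question is the classical heavy-ball (Polyak momentum) method. A minimal sketch on a smooth quadratic (the function, step size, and momentum parameter are illustrative; the paper's contribution is the length formula, not the iteration itself):

```python
import numpy as np

# Heavy-ball: x_{k+1} = x_k - s * grad f(x_k) + beta * (x_k - x_{k-1}).
f_grad = lambda x: np.array([2 * x[0], 10 * x[1]])   # gradient of x^2 + 5 y^2
s, beta = 0.05, 0.8
x_prev = x = np.array([3.0, -2.0])
for _ in range(500):
    x, x_prev = x - s * f_grad(x) + beta * (x - x_prev), x
print(np.allclose(x, 0.0))  # True: converges to the unique minimizer (0, 0)
```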
-
Strong Variational Sufficiency for Nonlinear Semidefinite Programming and Its Implications SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-06 Shiwei Wang, Chao Ding, Yangjing Zhang, Xinyuan Zhao
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2988-3011, December 2023. Abstract. Strong variational sufficiency is a newly proposed property, which turns out to be of great use in the convergence analysis of multiplier methods. However, what this property implies for nonpolyhedral problems remains a puzzle. In this paper, we prove the equivalence between the strong variational sufficiency
-
Universal Conditional Gradient Sliding for Convex Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2023-11-03 Yuyuan Ouyang, Trevor Squires
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2962-2987, December 2023. Abstract. In this paper, we present a first-order projection-free method, namely, the universal conditional gradient sliding (UCGS) method, for computing [math]-approximate solutions to convex differentiable optimization problems with bounded domains. For objective functions with Hölder continuous gradients under the Euclidean
-
Cyclic Coordinate Dual Averaging with Extrapolation SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-25 Chaobing Song, Jelena Diakonikolas
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2935-2961, December 2023. Abstract. Cyclic block coordinate methods are a fundamental class of optimization methods widely used in practice and implemented as part of standard software packages for statistical learning. Nevertheless, their convergence is generally not well understood and so far their good practical performance has not been explained
-
The Bipartite Boolean Quadric Polytope with Multiple-Choice Constraints SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-19 Andreas Bärmann, Alexander Martin, Oskar Schneider
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2909-2934, December 2023. Abstract. We consider the bipartite Boolean quadric polytope (BQP) with multiple-choice constraints and analyze its combinatorial properties. The well-studied BQP is defined as the convex hull of all quadric incidence vectors over a bipartite graph. In this work, we study the case where there is a partition on one of the
-
Sion’s Minimax Theorem in Geodesic Metric Spaces and a Riemannian Extragradient Algorithm SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-18 Peiyuan Zhang, Jingzhao Zhang, Suvrit Sra
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2885-2908, December 2023. Abstract. Deciding whether saddle points exist or are approximable for nonconvex-nonconcave problems is usually intractable. This paper takes a step towards understanding a broad class of nonconvex-nonconcave minimax problems that do remain tractable. Specifically, it studies minimax problems over geodesic metric spaces
-
Two-Stage Robust Quadratic Optimization with Equalities and Its Application to Optimal Power Flow SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-16 Olga Kuryatnikova, Bissan Ghaddar, Daniel K. Molzahn
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2830-2857, December 2023. Abstract. In this work, we consider two-stage quadratic optimization problems under ellipsoidal uncertainty. In the first stage, one needs to decide upon the values of a subset of optimization variables (control variables). In the second stage, the uncertainty is revealed, and the rest of the optimization variables (state
-
Optimal Self-Concordant Barriers for Quantum Relative Entropies SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-17 Hamza Fawzi, James Saunderson
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2858-2884, December 2023. Abstract. Quantum relative entropies are jointly convex functions of two positive definite matrices that generalize the Kullback–Leibler divergence and arise naturally in quantum information theory. In this paper, we prove self-concordance of natural barrier functions for the epigraphs of various quantum relative entropies
-
An Improved Unconstrained Approach for Bilevel Optimization SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Xiaoyin Hu, Nachuan Xiao, Xin Liu, Kim-Chuan Toh
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2801-2829, December 2023. Abstract. In this paper, we focus on the nonconvex-strongly-convex bilevel optimization problem (BLO). In this BLO, the objective function of the upper-level problem is nonconvex and possibly nonsmooth, and the lower-level problem is smooth and strongly convex with respect to the underlying variable [math]. We show that
-
Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Serge Gratton, Alena Kopaničáková, Philippe L. Toint
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2772-2800, December 2023. Abstract. A class of multilevel algorithms for unconstrained nonlinear optimization is presented which does not require the evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. The choice of avoiding the evaluation of the objective function is
-
Evolution of Mixed Strategies in Monotone Games SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Ryan Hynd
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2750-2771, December 2023. Abstract. We consider the basic problem of approximating Nash equilibria in noncooperative games. For monotone games, we design continuous time flows which converge in an averaged sense to Nash equilibria. We also study mean field equilibria, which arise in the large player limit of symmetric noncooperative games. In this
-
Dualities for Non-Euclidean Smoothness and Strong Convexity under the Light of Generalized Conjugacy SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Emanuel Laude, Andreas Themelis, Panagiotis Patrinos
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2721-2749, December 2023. Abstract. Relative smoothness and strong convexity have recently gained considerable attention in optimization. These notions are generalizations of the classical Euclidean notions of smoothness and strong convexity that are known to be dual to each other. However, conjugate dualities for non-Euclidean relative smoothness
-
Locating Theorems of Differential Inclusions Governed by Maximally Monotone Operators SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Minh N. Dao, Hassan Saoud, Michel A. Théra
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2703-2720, December 2023. Abstract. In this paper, we are interested in studying the asymptotic behavior of the solutions of differential inclusions governed by maximally monotone operators. In the case where LaSalle’s invariance principle is inconclusive, we provide a refined version of the invariance principle theorem. This result derives
-
Minimax Problems with Coupled Linear Constraints: Computational Complexity and Duality SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Ioannis Tsaknakis, Mingyi Hong, Shuzhong Zhang
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2675-2702, December 2023. Abstract. In this work we study a special minimax problem where there are linear constraints that couple both the minimization and maximization decision variables. The problem is a generalization of the traditional saddle point problem (which does not have the coupling constraint), and it finds applications in wireless
-
Affine Invariant Convergence Rates of the Conditional Gradient Method SIAM J. Optim. (IF 3.1) Pub Date : 2023-10-13 Javier F. Peña
SIAM Journal on Optimization, Volume 33, Issue 4, Page 2654-2674, December 2023. Abstract. We show that the conditional gradient method for the convex composite problem [math] generates primal and dual iterates with a duality gap converging to zero provided a suitable growth property holds and the algorithm makes a judicious choice of stepsizes. The rate of convergence of the duality gap to zero ranges