Decentralized gradient tracking with local steps Optim. Methods Softw. (IF 2.2) Pub Date : 2024-03-14 Yue Liu, Tao Lin, Anastasia Koloskova, Sebastian U. Stich
Gradient tracking (GT) is an algorithm designed for solving decentralized optimization problems over a network (such as training a machine learning model). A key feature of GT is a tracking mechani...
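The tracking mechanism mentioned here follows a well-known template: each agent mixes its neighbours' iterates through a doubly stochastic matrix W and maintains a tracker of the network-average gradient. A minimal sketch on synthetic quadratic local objectives; the ring topology, mixing weights, and step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dim, lr, iters = 4, 3, 0.1, 300
centers = rng.standard_normal((n, dim))       # agent i minimizes ||x - c_i||^2
opt = centers.mean(axis=0)                    # minimizer of the average objective

# Doubly stochastic mixing matrix for a ring of 4 agents (illustrative choice).
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])

grad = lambda X: 2 * (X - centers)            # stacked local gradients, one row per agent
X = np.zeros((n, dim))
Y = grad(X)                                   # trackers initialized to the local gradients
for _ in range(iters):
    X_new = W @ X - lr * Y                    # mix with neighbours, step along tracker
    Y = W @ Y + grad(X_new) - grad(X)         # tracker follows the average gradient
    X = X_new
# Each agent's row of X approaches the common minimizer `opt`.
```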
-
Toward state estimation by high gain differentiators with automatic differentiation Optim. Methods Softw. (IF 2.2) Pub Date : 2024-03-14 Klaus Röbenack, Daniel Gerbet
Most applications of automatic differentiation concern the field of optimization in the broadest sense. This means that many applications only need first and second order derivatives. An exception ...
-
Bilevel optimization with a multi-objective lower-level problem: risk-neutral and risk-averse formulations Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-20 T. Giovannelli, G. D. Kent, L. N. Vicente
In this work, we propose different formulations and gradient-based algorithms for deterministic and stochastic bilevel problems with conflicting objectives in the lower level. Such problems have re...
-
Complexity of a class of first-order objective-function-free optimization algorithms Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-08 S. Gratton, S. Jerad, Ph. L. Toint
A parametric class of trust-region algorithms for unconstrained non-convex optimization is considered where the value of the objective function is never computed. The class contains a deterministic...
-
Sequential hierarchical least-squares programming for prioritized non-linear optimal control Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-08 Kai Pfeiffer, Abderrahmane Kheddar
We present a sequential hierarchical least-squares programming solver with trust-region and hierarchical step-filter with application to prioritized discrete non-linear optimal control. It is based...
-
Optimal inexactness schedules for tunable oracle-based methods Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-05 Guillaume Van Dessel, François Glineur
Several recent works address the impact of inexact oracles in the convergence analysis of modern first-order optimization techniques, e.g. Bregman Proximal Gradient and Prox-Linear methods as well ...
-
PathWyse: a flexible, open-source library for the resource constrained shortest path problem Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-05 Matteo Salani, Saverio Basso, Vincenzo Giuffrida
In this paper, we consider a fundamental and hard combinatorial problem: the Resource Constrained Shortest Path Problem (RCSPP). We describe the implementation of a flexible, open-source library fo...
-
Optimized convergence of stochastic gradient descent by weighted averaging Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-05 Melinda Hagedorn, Florian Jarre
Under mild assumptions stochastic gradient methods asymptotically achieve an optimal rate of convergence if the arithmetic mean of all iterates is returned as an approximate optimal solution. Howev...
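The iterate-averaging idea described in this abstract (returning the mean of all SGD iterates rather than the last one) can be sketched on a generic least-squares problem; the data, decaying step size, and running-mean recursion below are illustrative assumptions, not the paper's weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.1 * rng.standard_normal(200)

def sgd_with_averaging(steps=5000, lr0=0.05):
    """SGD on 0.5*mean((a_i'x - b_i)^2); return last iterate and mean iterate."""
    x = np.zeros(5)
    avg = np.zeros(5)
    for k in range(1, steps + 1):
        i = rng.integers(len(b))
        g = (A[i] @ x - b[i]) * A[i]        # stochastic gradient from one sample
        x -= lr0 / np.sqrt(k) * g           # decaying step size
        avg += (x - avg) / k                # running arithmetic mean of all iterates
    return x, avg

x_last, x_avg = sgd_with_averaging()
```

The averaged iterate smooths out the noise of the individual steps and is typically a better approximation of the minimizer than the last iterate.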
-
Three-operator reflected forward-backward splitting algorithm with double inertial effects Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-05 Qiao-Li Dong, Min Su, Yekini Shehu
In this paper, we propose a reflected forward-backward splitting algorithm with two different inertial extrapolations to find a zero of the sum of three monotone operators consisting of the maximal...
-
Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods Optim. Methods Softw. (IF 2.2) Pub Date : 2024-02-05 Pavel Dvurechensky, Petr Ostroukhov, Alexander Gasnikov, César A. Uribe, Anastasiya Ivanova
Motivated, in particular, by the entropy-regularized optimal transport problem, we consider convex optimization problems with linear equality constraints, where the dual objective has Lipschitz pth...
-
Incremental quasi-Newton algorithms for solving a nonconvex, nonsmooth, finite-sum optimization problem Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-28 Gulcin Dinc Yalcin, Frank E. Curtis
Algorithms for solving certain nonconvex, nonsmooth, finite-sum optimization problems are proposed and tested. In particular, the algorithms are proposed and tested in the context of a transductive...
-
Predicting pairwise interaction affinities with ℓ0-penalized least squares–a nonsmooth bi-objective optimization based approach* Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-24 Pauliina Paasivirta, Riikka Numminen, Antti Airola, Napsu Karmitsa, Tapio Pahikkala
In this paper, we introduce a novel nonsmooth optimization-based method LMBM-Kronℓ0LS for solving large-scale pairwise interaction affinity prediction problems. The aim of LMBM-Kronℓ0LS is to produ...
-
Barzilai–Borwein-like rules in proximal gradient schemes for ℓ1-regularized problems Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-24 Serena Crisci, Simone Rebegoldi, Gerardo Toraldo, Marco Viola
We propose a novel steplength selection rule in proximal gradient methods for minimizing the sum of a differentiable function plus an ℓ1-norm penalty term. The proposed rule modifies one of the cla...
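The setting of this entry, proximal gradient for a smooth term plus an ℓ1 penalty with a Barzilai–Borwein-type steplength, can be sketched generically. The classical BB1 rule with a standard backtracking safeguard is shown below as an illustration; it is not the modified rule proposed in the paper, and the problem data are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam = 0.5

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)          # smooth part
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1

def prox_grad_bb(iters=200):
    """Proximal gradient for f(x) + lam*||x||_1 with a BB1 steplength
    and a standard sufficient-decrease backtracking safeguard."""
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2            # safe initial step (1/L)
    for _ in range(iters):
        while True:                                     # backtrack until decrease holds
            x_new = soft(x - alpha * g, alpha * lam)
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + (0.5 / alpha) * (d @ d) + 1e-12:
                break
            alpha *= 0.5
        g_new = A.T @ (A @ x_new - b)
        s, y = d, g_new - g
        if s @ y > 1e-12:
            alpha = (s @ s) / (s @ y)                  # BB1 rule for the next step
        x, g = x_new, g_new
    return x

x_hat = prox_grad_bb()
```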
-
Delay-tolerant distributed Bregman proximal algorithms Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-24 S. Chraibi, F. Iutzeler, J. Malick, A. Rogozin
Many problems in machine learning write as the minimization of a sum of individual loss functions over the training examples. These functions are usually differentiable but, in some cases, their gr...
-
Shape-changing trust-region methods using multipoint symmetric secant matrices Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-24 J. J. Brust, J. B. Erway, R. F. Marcia
In this work, we consider methods for large-scale and nonconvex unconstrained optimization. We propose a new trust-region method whose subproblem is defined using a so-called ‘shape-changing’ norm ...
-
Decentralized saddle point problems via non-Euclidean mirror prox Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-24 Alexander Rogozin, Aleksandr Beznosikov, Darina Dvinskikh, Dmitry Kovalev, Pavel Dvurechensky, Alexander Gasnikov
We consider smooth convex-concave saddle point problems in the decentralized distributed setting, where a finite-sum objective is distributed among the nodes of a computational network. At each nod...
-
The use of a family of Gerstewitz scalarization functions in the context of vector optimization with variable domination structures to derive scalarization results Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-19 Lam Quoc Anh, Tran Ngoc Tam
In this paper, we study a nonlinear scalarization function for a variable domination structure in an arbitrary linear space without assuming any particular topology. Conditions are provided under w...
-
Second-order cone programming for frictional contact mechanics using interior point algorithm Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-16 Vincent Acary, Paul Armand, Hoang Minh Nguyen, Maksym Shpakovych
We report experiments of an implementation of a primal–dual interior point algorithm for solving mechanical models of one-sided contact problems with Coulomb friction. The objective is to recover a...
-
On a Frank-Wolfe approach for abs-smooth functions Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-16 Timo Kreimeier, Sebastian Pokutta, Andrea Walther, Zev Woodstock
We propose an algorithm which appears to be the first bridge between the fields of conditional gradient methods and abs-smooth optimization. Our problem setting is motivated by various applications...
-
A gradient descent akin method for constrained optimization: algorithms and applications Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-16 Long Chen, Kai-Uwe Bletzinger, Nicolas R. Gauger, Yinyu Ye
We present a ‘gradient descent akin’ method (GDAM) for constrained optimization problems, i.e. the search direction is computed using a linear combination of the negative and normalized objective an...
-
PersA-FL: personalized asynchronous federated learning Optim. Methods Softw. (IF 2.2) Pub Date : 2024-01-11 Mohammad Taha Toghani, Soomin Lee, César A. Uribe
We study the personalized federated learning problem under asynchronous updates. In this problem, each client seeks to obtain a personalized model that simultaneously outperforms local and global m...
-
An ADMM based method for underdetermined box-constrained integer least squares problems Optim. Methods Softw. (IF 2.2) Pub Date : 2023-12-28 Xiao-Wen Chang, Tianchi Ma
To solve underdetermined box-constrained integer least squares (UBILS) problems, we propose an integer-constrained alternating direction method of multipliers (IADMM), which can be much more accura...
-
Customized Douglas-Rachford splitting methods for structured inverse variational inequality problems Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-24 Y. N. Jiang, X. J. Cai, D. R. Han, J. F. Yang
Recently, structured inverse variational inequality (SIVI) problems have attracted much attention. In this paper, we propose new splitting methods to solve SIVI problems by employing the idea of th...
-
Dual formulation of the sparsity constrained optimization problem: application to classification Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-21 M. Gaudioso, G. Giallombardo, J.-B. Hiriart-Urruty
We tackle the sparsity constrained optimization problem by resorting to polyhedral k-norm as a valid tool to emulate the ℓ0-pseudo-norm. The main novelty of the approach is the use of the dual of t...
-
Inexact tensor methods and their application to stochastic convex optimization Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-17 Artem Agafonov, Dmitry Kamzolov, Pavel Dvurechensky, Alexander Gasnikov, Martin Takáč
We propose general non-accelerated [The results for non-accelerated methods first appeared in December 2020 in the preprint (A. Agafonov, D. Kamzolov, P. Dvurechensky, and A. Gasnikov, Inexact tens...
-
A hybrid direct search and projected simplex gradient method for convex constrained minimization Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-15 A. L. Custódio, E. H. M. Krulikovski, M. Raydan
We propose a new Derivative-free Optimization (DFO) approach for solving convex constrained minimization problems. The feasible set is assumed to be the non-empty intersection of a finite collectio...
-
More on second-order properties of the Moreau regularization-approximation of a convex function Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-14 J.-B. Hiriart-Urruty
We unify and improve existing results on the second-order differentiability of the so-called Moreau regularization of a convex function.
-
A two-step new modulus-based matrix splitting method for vertical linear complementarity problem Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-09 Cuixia Li, Shiliang Wu
In this paper, for solving the vertical linear complementarity problem (VLCP) effectively, a two-step new modulus-based matrix splitting (TNMMS) iteration method is introduced. Its convergence prop...
-
Learning graph Laplacian with MCP Optim. Methods Softw. (IF 2.2) Pub Date : 2023-11-07 Yangjing Zhang, Kim-Chuan Toh, Defeng Sun
We consider the problem of learning a graph under the Laplacian constraint with a non-convex penalty: minimax concave penalty (MCP). For solving the MCP penalized graphical model, we design an inex...
-
Loraine – an interior-point solver for low-rank semidefinite programming Optim. Methods Softw. (IF 2.2) Pub Date : 2023-10-23 Soodeh Habibi, Michal Kočvara, Michael Stingl
The aim of this paper is to introduce a new code for the solution of large-and-sparse linear semidefinite programs (SDPs) with low-rank solutions or solutions with few outlying eigenvalues, and/or ...
-
AB/Push-Pull method for distributed optimization in time-varying directed networks Optim. Methods Softw. (IF 2.2) Pub Date : 2023-10-17 Angelia Nedić, Duong Thuy Anh Nguyen, Duong Tung Nguyen
In this paper, we study the distributed optimization problem for a system of agents embedded in time-varying directed communication networks. Each agent has its own cost function and agents coopera...
-
A note on the computational complexity of chain rule differentiation Optim. Methods Softw. (IF 2.2) Pub Date : 2023-10-17 U. Naumann
We generalize the proof of NP-completeness of Jacobian accumulation using a given number of floating-point operations to arbitrary order.
-
IntSat: integer linear programming by conflict-driven constraint learning Optim. Methods Softw. (IF 2.2) Pub Date : 2023-09-27 Robert Nieuwenhuis, Albert Oliveras, Enric Rodríguez-Carbonell
State-of-the-art SAT solvers are nowadays able to handle huge real-world instances. The key to this success is the Conflict-Driven Clause-Learning (CDCL) scheme, which encompasses a number of techn...
-
A highly efficient algorithm for solving exclusive lasso problems Optim. Methods Softw. (IF 2.2) Pub Date : 2023-09-25 Meixia Lin, Yancheng Yuan, Defeng Sun, Kim-Chuan Toh
The exclusive lasso (also known as elitist lasso) regularizer has become popular recently due to its superior performance on intra-group feature selection. Its complex nature poses difficulties for...
-
Convergence analysis of stochastic higher-order majorization–minimization algorithms Optim. Methods Softw. (IF 2.2) Pub Date : 2023-09-25 Daniela Lupu, Ion Necoara
Majorization–minimization schemes are a broad class of iterative methods targeting general optimization problems, including nonconvex, nonsmooth and stochastic. These algorithms minimize successive...
-
GBOML: a structure-exploiting optimization modelling language in Python Optim. Methods Softw. (IF 2.2) Pub Date : 2023-09-08 Bardhyl Miftari, Mathias Berger, Guillaume Derval, Quentin Louveaux, Damien Ernst
Mixed-Integer Linear Programs (MILPs) have many practical applications. Most modelling tools for MILPs fall into two broad categories: algebraic modelling languages allow practitioners to compactly encode models using syntax close to mathematical notation but usually lack support for special structures, while other tools instead provide predefined components that can be easily assembled
-
Techniques for accelerating branch-and-bound algorithms dedicated to sparse optimization Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-31 Gwenaël Samain, Sébastien Bourguignon, Jordan Ninin
Sparse optimization (fitting data with a low-cardinality linear model) is addressed through the minimization of a cardinality-penalized least-squares function, for which dedicated branch-and-bound algorithms clearly outperform generic mixed-integer-programming solvers. Three acceleration techniques are proposed for such algorithms. Convex relaxation problems at each node are addressed with dual approaches
-
A class of projected-search methods for bound-constrained optimization Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-18 Michael W. Ferry, Philip E. Gill, Elizabeth Wong, Minxin Zhang
Projected-search methods for bound-constrained optimization are based on performing a search along a piecewise-linear continuous path obtained by projecting a search direction onto the feasible region. A potential benefit of a projected-search method is that many changes to the active set can be made at the cost of computing a single search direction. As the objective function is not differentiable
-
A new inertial projected reflected gradient method with application to optimal control problems Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-15 Chinedu Izuchukwu, Yekini Shehu
The projected reflected gradient method has been shown to be a simple and elegant method for solving variational inequalities. The method involves one projection onto the feasible set and one evaluation of the cost operator per iteration and has been shown numerically to be more efficient than most available methods for solving variational inequalities. Convergence results for methods with similar
-
Distributionally robust joint chance-constrained programming with Wasserstein metric Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-08 Yining Gu, Yanjun Wang
In this paper, we develop an exact reformulation and a deterministic approximation for distributionally robust joint chance-constrained programs (DRCCPs) with a general class of convex uncertain constraints under data-driven Wasserstein ambiguity sets. It is known that robust chance constraints can be conservatively approximated by worst-case conditional value-at-risk (CVaR) constraints. It is
-
The role of local steps in local SGD Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-07 Tiancheng Qin, S. Rasoul Etesami, César A. Uribe
We consider the distributed stochastic optimization problem where n agents want to minimize a global function given by the sum of agents' local functions and focus on the heterogeneous setting when agents' local functions are defined over non-i.i.d. datasets. We study the Local SGD method, where agents perform a number of local stochastic gradient steps and occasionally communicate with a central node
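The Local SGD scheme described here, agents taking several local gradient steps between communication rounds with a central node, can be sketched generically. The heterogeneous per-agent quadratics, step counts, and noise level below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n_agents, dim, local_steps, rounds, lr = 4, 3, 10, 50, 0.1
# Heterogeneous setting: agent i minimizes ||x - c_i||^2 with its own c_i
# (a stand-in for non-i.i.d. local datasets).
centers = rng.standard_normal((n_agents, dim))
global_opt = centers.mean(axis=0)   # minimizer of the averaged objective

x = np.zeros(dim)                   # model broadcast by the central node
for _ in range(rounds):
    local_models = []
    for i in range(n_agents):
        xi = x.copy()
        for _ in range(local_steps):             # H local stochastic gradient steps
            g = 2 * (xi - centers[i]) + 0.01 * rng.standard_normal(dim)
            xi -= lr * g
        local_models.append(xi)
    x = np.mean(local_models, axis=0)  # occasional communication: average models
```

More local steps per round cut communication, but with heterogeneous data the local iterates drift toward their own minimizers between averaging steps, which is exactly the trade-off such analyses quantify.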
-
Urban air mobility: from complex tactical conflict resolution to network design and fairness insights Optim. Methods Softw. (IF 2.2) Pub Date : 2023-08-08 Mercedes Pelegrín, Claudia D'Ambrosio, Rémi Delmas, Youssef Hamadi
Urban Air Mobility (UAM) has the potential to revolutionize transportation. It will exploit the third dimension to help smooth ground traffic in densely populated areas. This new paradigm in mobili...
-
An equivalent nonlinear optimization model with triangular low-rank factorization for semidefinite programs Optim. Methods Softw. (IF 2.2) Pub Date : 2023-07-10 Yuya Yamakawa, Tetsuya Ikegami, Ellen H. Fukuda, Nobuo Yamashita
In this paper, we propose a new nonlinear optimization model to solve semidefinite optimization problems (SDPs), providing some properties related to local optimal solutions. The proposed model is ...
-
Sparse convex optimization toolkit: a mixed-integer framework Optim. Methods Softw. (IF 2.2) Pub Date : 2023-07-10 Alireza Olama, Eduardo Camponogara, Jan Kronqvist
This paper proposes an open-source distributed solver for solving Sparse Convex Optimization (SCO) problems over computational networks. Motivated by past algorithmic advances in mixed-integer opti...