Smoothing neural network for L0 regularized optimization problem with general convex constraints
Neural Networks (IF 7.8) Pub Date: 2021-08-08, DOI: 10.1016/j.neunet.2021.08.001
Wenjing Li, Wei Bian

In this paper, we propose a neural network modeled by a differential inclusion to solve a class of discontinuous and nonconvex sparse regression problems with general convex constraints, whose objective function is the sum of a convex, but not necessarily differentiable, loss function and an L0 regularization term. We construct a smoothing relaxation function of the L0 regularization and propose a neural network to solve the considered problem. We prove that, for any initial point satisfying the linear equality constraints, the solution of the proposed neural network exists globally, is bounded, reaches the feasible region in finite time, and remains there thereafter. Moreover, the solution of the proposed neural network is its slow solution, and any accumulation point of it is a Clarke stationary point of the proposed nonconvex smoothing approximation problem. In the box-constrained case, all accumulation points of the solution share a unified lower-bound property and a common support set. Except for one special case, every accumulation point of the solution is a local minimizer of the considered problem. In particular, the proposed neural network has a simpler structure than most existing neural networks for solving locally Lipschitz continuous but nonsmooth nonconvex problems. Finally, we present numerical experiments that show the efficiency of the proposed neural network.
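To illustrate the idea of a smoothing relaxation of L0 regularization, the sketch below uses a capped-quadratic surrogate phi_mu(t) = min(t^2/mu^2, 1). This is a generic, hypothetical stand-in chosen for illustration only, not the specific smoothing function constructed in the paper; it is continuous (though not differentiable where |t| = mu) and converges pointwise to the 0/1 indicator as the smoothing parameter mu shrinks, so the sum over components approximates the L0 "norm".

```python
import numpy as np

def l0_smooth(x, mu):
    """Capped-quadratic surrogate for the L0 'norm' (illustrative only).

    phi_mu(t) = min(t^2 / mu^2, 1) equals 0 at t = 0 and tends to 1
    for every t != 0 as mu -> 0, so sum_i phi_mu(x_i) -> ||x||_0.
    """
    t = np.asarray(x, dtype=float)
    return np.minimum(t**2 / mu**2, 1.0)

# As mu decreases, the surrogate sum approaches the true L0 count.
x = np.array([0.0, 0.3, -1.5])          # ||x||_0 = 2
for mu in (1.0, 0.1, 1e-4):
    print(mu, l0_smooth(x, mu).sum())    # sum tends to 2.0 as mu -> 0
```

In a smoothing-method framework, such a surrogate replaces the discontinuous L0 term so that subgradient-based dynamics (here, the differential inclusion defining the network) are well posed, with mu driven toward zero along the trajectory.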




Updated: 2021-08-15