Stochastic DCA for minimizing a large sum of DC functions with application to multi-class logistic regression.
Neural Networks (IF 7.8), Pub Date: 2020-09-02, DOI: 10.1016/j.neunet.2020.08.024
Hoai An Le Thi, Hoai Minh Le, Duy Nhat Phan, Bach Tran

We consider the problem of minimizing a large sum of DC (Difference of Convex) functions, which appears in several different areas, especially in stochastic optimization and machine learning. Two DCA (DC Algorithm) based algorithms are proposed: stochastic DCA and inexact stochastic DCA. We prove that both algorithms converge to a critical point with probability one. Furthermore, we develop our stochastic DCA for solving an important problem in multi-task learning, namely group variable selection in multi-class logistic regression. The corresponding stochastic DCA is computationally very inexpensive: all computations are explicit. Numerical experiments on several benchmark and synthetic datasets illustrate the efficiency of our algorithms and their superiority over existing methods with respect to classification accuracy, sparsity of the solution, and running time.
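The general scheme behind DCA is to write each objective as a difference of two convex functions, f = g - h, and to iterate: linearize h at the current point via a subgradient, then solve the resulting convex subproblem. In a stochastic variant over a large sum, only a minibatch of component subgradients is refreshed per iteration while stored subgradients for the remaining components are reused. The sketch below is not the paper's algorithm but a minimal illustration of that idea on a toy DC problem (quadratic data-fitting terms minus an l1 term); the problem instance, batch size, and closed-form subproblem solution are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DC objective (illustrative, not the paper's model):
#   F(x) = (1/n) * sum_i 0.5*||x - a_i||^2  -  lam*||x||_1
# Each component f_i = g_i - h with g_i(x) = 0.5*||x - a_i||^2 (convex)
# and h(x) = lam*||x||_1 (convex), so F is a large sum of DC functions.
n, d, lam = 200, 5, 0.1
A = rng.normal(size=(n, d))

# Stochastic DCA sketch: keep one stored subgradient of h per component,
# refresh only a sampled minibatch each iteration, average the stored
# subgradients, and solve the convex subproblem (here in closed form).
x = np.zeros(d)
Y = np.zeros((n, d))                      # stored subgradients of h, per component
batch = 20
for k in range(500):
    idx = rng.choice(n, size=batch, replace=False)
    Y[idx] = lam * np.sign(x)             # refresh subgradients on the minibatch
    y_bar = Y.mean(axis=0)                # averaged (stale + fresh) subgradient
    # Convex subproblem: min_x (1/n) * sum_i 0.5*||x - a_i||^2 - y_bar @ x
    # has the explicit solution below, mirroring the "all computations
    # are explicit" property highlighted in the abstract.
    x = A.mean(axis=0) + y_bar

print(x)
```

At a limit point the iterate satisfies the fixed-point condition x = mean(a_i) + lam*sign(x), i.e. a critical point of the DC decomposition; the cheap closed-form update is what makes each stochastic iteration inexpensive.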




Updated: 2020-09-10