Incremental DC optimization algorithm for large-scale clusterwise linear regression
Journal of Computational and Applied Mathematics (IF 2.1), Pub Date: 2020-12-29, DOI: 10.1016/j.cam.2020.113323
Adil M. Bagirov, Sona Taheri, Emre Cimen

The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem. We apply a quasi-Newton method to solve the subproblems. The proposed algorithm is evaluated on several synthetic and real-world regression data sets and compared with other algorithms for CLR. Results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms the other algorithms when the number of input variables is small.
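For readers who want a concrete picture of the iteration the abstract describes: with data points (a_i, b_i) and k hyperplanes w_j = (x_j, y_j), the CLR objective is F(W) = Σ_i min_j h_ij(W), where h_ij is the squared error of point i under hyperplane j. Since each h_ij is convex, the standard DC split min_j h_j = Σ_j h_j − max_j Σ_{l≠j} h_l writes F as a difference of two convex functions, and each DCA step linearizes the concave part and solves a smooth unconstrained subproblem. The Python sketch below is a minimal reading of that scheme, not the authors' implementation: it omits the incremental construction of starting points, uses SciPy's L-BFGS-B as the quasi-Newton solver, and all names such as fit_clr_dca are hypothetical.

import numpy as np
from scipy.optimize import minimize

def fit_clr_dca(A, b, k, iters=50, tol=1e-8, seed=0):
    """One DCA loop for the DC split F = F1 - F2, where
    F1(W) = sum_ij h_ij(W) and F2(W) = sum_i max_j sum_{l != j} h_il(W)."""
    m, n = A.shape
    A1 = np.hstack([A, np.ones((m, 1))])       # append intercept column
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((k, n + 1))  # row j holds (x_j, y_j)

    def residuals(Wmat):
        return A1 @ Wmat.T - b[:, None]        # (m, k) residual matrix

    for _ in range(iters):
        R = residuals(W)
        j_star = np.argmin(R ** 2, axis=1)     # hyperplane attaining the min
        # Subgradient of the concave part F2: every hyperplane except the
        # active one contributes the gradient of its squared error.
        mask = np.ones((m, k))
        mask[np.arange(m), j_star] = 0.0
        V = 2.0 * (R * mask).T @ A1            # (k, n+1) subgradient matrix

        # Smooth convex subproblem: minimize F1(W) - <V, W> with a
        # quasi-Newton method (L-BFGS); phi returns value and gradient.
        def phi(w):
            Wm = w.reshape(k, n + 1)
            Rm = residuals(Wm)
            val = (Rm ** 2).sum() - (V * Wm).sum()
            grad = 2.0 * Rm.T @ A1 - V
            return val, grad.ravel()

        res = minimize(phi, W.ravel(), jac=True, method="L-BFGS-B")
        W_new = res.x.reshape(k, n + 1)
        if np.linalg.norm(W_new - W) < tol:    # DCA fixed point reached
            break
        W = W_new
    return W

A fixed point of this iteration is a critical point of the DC program. The incremental part of the authors' method, which gradually increases the number of hyperplanes to generate good starting points, is deliberately left out of the sketch.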



Updated: 2021-01-12