A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking
Journal of Systems Engineering and Electronics (IF 1.9). Pub Date: 2021-03-03. DOI: 10.23919/jsee.2021.000014
Hu Lei, Yi Guoxing, Huang Chao

Least square support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its application, particularly in scenarios that require fast prediction. In this paper, a sparse adaptive pruning LSSVR algorithm based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is presented, and data analysis experiments are conducted to illustrate the importance ranking of data points. Then, a pruning strategy that removes two samples per decremental learning step is designed to accelerate training and ensure sparsity, and the removed data points are used to test the temporary learning model, which safeguards regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets. Experimental results indicate that, compared with several benchmark algorithms, GRPR-AP-LSSVR achieves excellent sparsity and prediction speed without impairing generalization performance.
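
To make the pruning idea concrete, the following is a minimal sketch of an LSSVR solver with an iterative two-sample pruning loop. The abstract does not give the exact GRPR scoring rule, so the ranking here is a hypothetical stand-in (the magnitude of the support coefficients |alpha_i|), and the removed points are used as a test set for the temporary model, as the abstract describes. All function names and parameters (gamma, sigma, tol) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LSSVR linear system:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, support coefficients alpha

def lssvr_predict(Xq, X, b, alpha, sigma=1.0):
    return rbf_kernel(Xq, X, sigma) @ alpha + b

def prune_lssvr(X, y, tol=0.05, gamma=10.0, sigma=1.0):
    # Decremental learning: drop the two least important points per round
    # (stand-in importance score: |alpha_i|; the paper's GRPR score would
    # replace this), and stop when the temporary model's error on all
    # removed points exceeds the tolerance.
    keep = np.arange(len(y))
    removed = np.array([], dtype=int)
    b, alpha = lssvr_fit(X, y, gamma, sigma)
    while len(keep) > 2:
        order = np.argsort(np.abs(alpha))        # ascending importance
        drop, trial = keep[order[:2]], keep[order[2:]]
        b_t, a_t = lssvr_fit(X[trial], y[trial], gamma, sigma)
        test_idx = np.concatenate((removed, drop))
        pred = lssvr_predict(X[test_idx], X[trial], b_t, a_t, sigma)
        err = np.sqrt(np.mean((pred - y[test_idx]) ** 2))
        if err > tol:
            break                                # accuracy would degrade; keep current model
        keep, removed, b, alpha = trial, test_idx, b_t, a_t
    return keep, b, alpha
```

The loop trades support-set size against the error measured on the pruned points, which is how the sketch mimics the "remove two samples, then test the temporary model" strategy described above; only the points in `keep` are needed at prediction time, which is where the speed-up comes from.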

Updated: 2021-03-05