Learning via variably scaled kernels
Advances in Computational Mathematics ( IF 1.7 ) Pub Date : 2021-06-26 , DOI: 10.1007/s10444-021-09875-6
C. Campi, F. Marchetti, E. Perracchione

We investigate the use of the so-called variably scaled kernels (VSKs) for learning tasks, with a particular focus on support vector machine (SVM) classifiers and kernel regression networks (KRNs). Concerning the kernels used to train the models, under appropriate assumptions, the VSKs turn out to be more expressive and more stable than the standard ones. Numerical experiments and applications to breast cancer and coronavirus disease 2019 (COVID-19) data support our claims. For the practical implementation of the VSK setting, we need to select a suitable scaling function. To this aim, we propose different choices, including for SVMs a probabilistic approach based on the naive Bayes (NB) classifier. For the classification task, we also numerically show that the VSKs inspire an alternative scheme to the sometimes computationally demanding feature extraction procedures.
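The core VSK idea described above can be sketched concretely: a variably scaled kernel is obtained by augmenting each sample x with the value of a scaling function ψ(x), so that K_ψ(x, y) = K((x, ψ(x)), (y, ψ(y))) for a standard kernel K. The snippet below is a minimal illustration with scikit-learn, not the authors' implementation; the toy data and the choice of ψ (here, the distance from the origin) are hypothetical stand-ins for a problem-adapted scaling function such as the NB-based one proposed in the paper.

```python
import numpy as np
from sklearn.svm import SVC

def vsk_augment(X, psi):
    """Map x -> (x, psi(x)): training a standard kernel on the
    augmented points realizes the variably scaled kernel K_psi."""
    return np.hstack([X, psi(X).reshape(-1, 1)])

# Toy data: labels determined by distance from the origin.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1).astype(int)

# Hypothetical scaling function psi; in practice it encodes
# prior knowledge about the data (e.g. class probabilities).
psi = lambda X: np.linalg.norm(X, axis=1)

# Standard RBF SVM trained on the VSK-augmented inputs.
clf = SVC(kernel="rbf", gamma=1.0)
clf.fit(vsk_augment(X, psi), y)
print(clf.score(vsk_augment(X, psi), y))
```

Because the augmentation happens before the kernel is evaluated, any off-the-shelf kernel machine can be used unchanged, which is what makes the VSK setting a lightweight alternative to explicit feature extraction.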



Updated: 2021-06-28