Deep Cross-Output Knowledge Transfer Using Stacked-Structure Least-Squares Support Vector Machines.
IEEE Transactions on Cybernetics (IF 11.8), Pub Date: 2020-01-01, DOI: 10.1109/tcyb.2020.3008963
Guanjin Wang, Kup-Sze Choi, Jeremy Yuen-Chun Teoh, Jie Lu

This article presents a new deep cross-output knowledge transfer approach based on least-squares support vector machines (LS-SVMs), called DCOT-LS-SVMs. Its aim is to improve the generalizability of LS-SVMs while avoiding the complicated parameter-tuning process required by many kernel machines. The proposed approach has two significant characteristics: 1) DCOT-LS-SVMs is inspired by a stacked hierarchical architecture that combines several LS-SVM modules layer by layer, where each higher-layer module receives additional input features formed from the predictions of all previous modules; and 2) cross-output knowledge transfer leverages the predictions of the previous module to improve the learning process in the current module. With this approach, the model parameters, such as the tradeoff parameter C and the kernel width δ, can be randomly assigned to each module, which greatly simplifies the learning process. Moreover, DCOT-LS-SVMs autonomously and quickly decides the extent of cross-output knowledge transfer between adjacent modules through a fast leave-one-out cross-validation strategy. In addition, since imbalanced datasets are common in real-world scenarios, we present an imbalanced version of DCOT-LS-SVMs, called IDCOT-LS-SVMs. The effectiveness of the proposed approaches is demonstrated through comparisons with five competing methods on UCI datasets and through a case study on the diagnosis of prostate cancer.
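To make the stacked structure concrete, the following is a minimal sketch of the layer-by-layer idea only: each module is a plain LS-SVM (regression form on ±1 labels, RBF kernel), its tradeoff parameter C and kernel width are drawn at random, and the input of every higher-layer module is augmented with the decision outputs of all previous modules. It is an illustration under these assumptions, not the paper's method; it omits the cross-output transfer term, the fast leave-one-out selection of the transfer extent, and the imbalanced weighting of IDCOT-LS-SVMs, and the names `LSSVM`, `StackedLSSVM`, and `rbf_kernel` are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, width):
    """RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * width^2))."""
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * width ** 2))

class LSSVM:
    """Minimal LS-SVM classifier (regression form applied to +/-1 labels)."""
    def __init__(self, C, width):
        self.C, self.width = C, width

    def fit(self, X, y):
        n = X.shape[0]
        K = rbf_kernel(X, X, self.width)
        # LS-SVM dual: solve the linear system
        # [0   1^T     ] [b    ]   [0]
        # [1   K + I/C ] [alpha] = [y]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.C
        sol = np.linalg.solve(A, np.concatenate(([0.0], y.astype(float))))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def decision_function(self, X):
        return rbf_kernel(X, self.X, self.width) @ self.alpha + self.b

class StackedLSSVM:
    """Stack of LS-SVM modules; module l sees the original features plus
    the decision outputs of all previous modules (a sketch of the stacked
    structure described in the abstract, without cross-output transfer)."""
    def __init__(self, n_modules=3, rng=None):
        self.n_modules = n_modules
        self.rng = rng or np.random.default_rng(0)

    def fit(self, X, y):
        self.modules, Z = [], X
        for _ in range(self.n_modules):
            # C and the kernel width are randomly assigned per module,
            # as the abstract suggests, rather than tuned.
            C = 10 ** self.rng.uniform(-1, 3)
            width = 10 ** self.rng.uniform(-1, 1)
            m = LSSVM(C, width).fit(Z, y)
            self.modules.append(m)
            # Augment the next module's input with this module's output.
            Z = np.hstack([Z, m.decision_function(Z)[:, None]])
        return self

    def predict(self, X):
        Z = X
        for m in self.modules:
            out = m.decision_function(Z)
            Z = np.hstack([Z, out[:, None]])
        return np.sign(out)
```

Random per-module (C, width) values are tolerable in such a stack because later modules can compensate for poorly parameterized earlier ones through the augmented outputs; in the paper's formulation this is further controlled by weighting how much of the previous module's output is transferred, chosen via the fast leave-one-out procedure.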

Updated: 2020-01-01