Learning Multiple Parameters for Kernel Collaborative Representation Classification
IEEE Transactions on Neural Networks and Learning Systems (IF 10.2), Pub Date: 2020-01-17, DOI: 10.1109/tnnls.2019.2962878
Jianjun Liu, Zebin Wu, Liang Xiao, Hong Yan

In this article, the problem of automatically learning multiple parameters for kernel collaborative representation classification (KCRC) is considered. We investigate KCRC and measure its generalization error via leave-one-out cross-validation (LOO-CV). By exploiting the specific properties of KCRC, a closed-form expression is derived for the LOO-CV outputs. A simple classification rule that provides probabilistic outputs is then adopted, and an effective loss function, expressed explicitly in terms of the parameters, is proposed as the generalization error. The gradients of this loss function are calculated, and the parameters are learned by minimizing it with a gradient-based optimization algorithm. Furthermore, the proposed approach makes it possible to solve the multiple kernel/feature learning problems of KCRC effectively. Experimental results on six data sets taken from different scenes demonstrate the effectiveness of the proposed approach.
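The abstract describes a full pipeline: represent a test sample collaboratively in a kernel-induced feature space, score each class by its reconstruction residual, convert residuals into class probabilities, and tune the parameters by minimizing a LOO-CV loss with a gradient-based optimizer. The following is a minimal NumPy sketch of that pipeline, not the authors' implementation. It assumes an RBF kernel width gamma and a ridge parameter lam as stand-ins for the paper's "multiple parameters", uses a softmax over negative residuals as the probabilistic rule, computes the LOO outputs by explicit per-fold refitting rather than the paper's closed-form expression, and lets SciPy's L-BFGS-B approximate gradients by finite differences instead of the analytic gradients derived in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, gamma):
    """Gram matrix of the RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kcrc_class_residuals(K_train, k_test, k_tt, labels, lam):
    """Class-wise reconstruction residuals of one test sample in feature space.

    Coding vector: alpha = (K + lam*I)^{-1} kappa(y).  For class c, the residual
    ||phi(y) - Phi_c alpha_c||^2 = k(y,y) - 2 alpha_c' kappa_c + alpha_c' K_cc alpha_c.
    """
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * np.eye(n), k_test)
    residuals = []
    for c in np.unique(labels):
        m = labels == c
        a = alpha[m]
        residuals.append(k_tt - 2.0 * a @ k_test[m] + a @ K_train[np.ix_(m, m)] @ a)
    return np.array(residuals)

def loo_loss(params, X, labels):
    """LOO generalization error: mean negative log-probability of the true class.

    Here each fold is refit explicitly; the paper instead derives a closed-form
    expression for the LOO outputs, which is what makes its approach efficient.
    """
    gamma, lam = np.exp(params)           # optimize in log-space to keep both positive
    K = rbf_kernel(X, X, gamma)
    classes = np.unique(labels)
    loss = 0.0
    for i in range(X.shape[0]):
        keep = np.arange(X.shape[0]) != i
        r = kcrc_class_residuals(K[np.ix_(keep, keep)], K[keep, i], K[i, i],
                                 labels[keep], lam)
        e = np.exp(-(r - r.min()))        # softmax over negative residuals
        p = e / e.sum()                   # probabilistic classification rule
        loss -= np.log(p[classes == labels[i]][0] + 1e-12)
    return loss / X.shape[0]

# Toy data: three shifted Gaussian classes (illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
labels = np.repeat(np.arange(3), 20)
X[labels == 1] += 2.0
X[labels == 2] -= 2.0

# Learn (gamma, lam) by minimizing the LOO loss with a gradient-based optimizer;
# with jac=None, L-BFGS-B uses finite-difference gradients in place of the
# paper's analytic ones.
res = minimize(loo_loss, x0=np.log([1.0, 0.1]), args=(X, labels), method="L-BFGS-B")
gamma_opt, lam_opt = np.exp(res.x)
print("learned gamma, lambda:", gamma_opt, lam_opt)
```

The explicit refitting above costs a full linear solve per held-out sample per loss evaluation; the paper's closed-form LOO expression and analytic gradients avoid exactly that overhead. The sketch trades this efficiency for readability.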

Updated: 2020-01-17