Multi-task Learning in vector-valued reproducing kernel Banach spaces with the ℓ1 norm
Journal of Complexity (IF 1.7), Pub Date: 2020-09-03, DOI: 10.1016/j.jco.2020.101514
Rongrong Lin, Guohui Song, Haizhang Zhang

Targeting sparse multi-task learning, we consider regularization models with an ℓ1 penalty on the coefficients of kernel functions. In order to provide a kernel method for this model, we construct a class of vector-valued reproducing kernel Banach spaces with the ℓ1 norm. The notion of multi-task admissible kernels is proposed so that the constructed spaces have desirable properties, including the crucial linear representer theorem. Such kernels are related to bounded Lebesgue constants of a kernel interpolation problem. We study the Lebesgue constant of multi-task kernels and provide examples of admissible kernels. Furthermore, we present numerical experiments on both synthetic data and real-world benchmark data to demonstrate the advantages of the proposed construction and regularization models.
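To make the kind of model described above concrete, the following is a minimal sketch (not the authors' construction or algorithm) of an ℓ1-penalized kernel-coefficient fit for two tasks, where each task's predictor takes the linear representer form f_t(x) = Σ_j c_{j,t} K(x, x_j). The Gaussian kernel, the ISTA (proximal-gradient) solver, and all parameter choices are assumptions made purely for illustration.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of X1 and X2 (an illustrative choice).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def soft_threshold(A, t):
    # Proximal operator of the entrywise l1 norm.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def l1_kernel_multitask(K, Y, lam=0.1, n_iter=500):
    """ISTA for min_C 0.5*||K C - Y||_F^2 + lam*||C||_1,
    where column t of C holds the kernel-expansion coefficients of task t."""
    n, T = Y.shape
    C = np.zeros((n, T))
    step = 1.0 / (np.linalg.norm(K, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = K.T @ (K @ C - Y)
        C = soft_threshold(C - step * grad, step * lam)
    return C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(60, 2))
    Y = np.column_stack([np.sin(X.sum(1)), np.cos(X[:, 0])])  # two related synthetic tasks
    K = gaussian_kernel(X, X)
    C = l1_kernel_multitask(K, Y, lam=0.05)
    print("nonzero coefficients per task:", (np.abs(C) > 1e-8).sum(0))
    # Predictions at new points via the representer form f_t(x) = sum_j C[j, t] * K(x, x_j)
    X_new = rng.uniform(-1, 1, size=(5, 2))
    print(gaussian_kernel(X_new, X) @ C)
```

The ℓ1 penalty drives many coefficients exactly to zero, which is the sparsity the abstract refers to; the paper's contribution is the vector-valued reproducing kernel Banach space framework that justifies such finite kernel expansions via a linear representer theorem.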




Updated: 2020-09-03