Transfer learning for nonparametric classification: Minimax rate and adaptive classifier
Annals of Statistics (IF 3.2). Pub Date: 2021-01-29. DOI: 10.1214/20-aos1949
T. Tony Cai , Hongji Wei

Human learners have the natural ability to use knowledge gained in one setting for learning in a different but related setting. This ability to transfer knowledge from one task to another is essential for effective learning. In this paper, we study transfer learning in the context of nonparametric classification based on observations from different distributions under the posterior drift model, a general framework that arises in many practical problems. We first establish the minimax rate of convergence and construct a rate-optimal two-sample weighted $K$-NN classifier. The results precisely characterize the contribution of the observations from the source distribution to the classification task under the target distribution. A data-driven adaptive classifier is then proposed and shown to simultaneously attain, within a logarithmic factor, the optimal rate over a large collection of parameter spaces. Simulation studies and real data applications are carried out, and the numerical results further illustrate the theoretical analysis. Extensions to the case of multiple source distributions are also considered.
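To make the idea of a two-sample weighted $K$-NN classifier concrete, the following is a minimal sketch. The function name, the Euclidean metric, and the specific tuning parameters (`k_p`, `k_q`, `w_p`, `w_q`) are illustrative assumptions; the paper derives rate-optimal choices for these quantities under the posterior drift model, which are not reproduced here.

```python
import numpy as np

def weighted_knn_classify(x, X_p, y_p, X_q, y_q, k_p, k_q, w_p, w_q):
    """Two-sample weighted K-NN classifier (illustrative sketch).

    Labels are in {0, 1}. (X_p, y_p) is the labeled sample from the
    source distribution P; (X_q, y_q) is the labeled sample from the
    target distribution Q. Neighbor counts k_p, k_q and vote weights
    w_p, w_q are tuning parameters (hypothetical defaults here).
    """
    # Labels of the k_p nearest source neighbors of the query point x
    d_p = np.linalg.norm(X_p - x, axis=1)
    nn_p = y_p[np.argsort(d_p)[:k_p]]
    # Labels of the k_q nearest target neighbors of x
    d_q = np.linalg.norm(X_q - x, axis=1)
    nn_q = y_q[np.argsort(d_q)[:k_q]]
    # Weighted vote pooling the two samples; predict 1 when it exceeds 1/2
    vote = (w_p * nn_p.sum() + w_q * nn_q.sum()) / (w_p * k_p + w_q * k_q)
    return int(vote > 0.5)
```

Weighting the source neighbors differently from the target neighbors is what allows the classifier to exploit source data whose regression function is related to, but not identical with, the target's.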

Updated: 2021-01-29