K-NN active learning under local smoothness assumption
arXiv - CS - Machine Learning. Pub Date: 2020-01-17. DOI: arxiv-2001.06485. Boris Ndjia Njike, Xavier Siebert
There is a large body of work on convergence rates in both passive and
active learning. Here we first outline some of the main results that have been
obtained, specifically in a nonparametric setting under assumptions about
the smoothness of the regression function (or the boundary between classes) and
the margin noise. We discuss the relative merits of these underlying
assumptions by putting active learning in perspective with recent work on
passive learning. We design an active learning algorithm with a convergence
rate better than that of passive learning, using a particular smoothness
assumption customized for k-nearest neighbors. Unlike previous active learning
algorithms, we use a smoothness assumption that depends on the
marginal distribution of the instance space. Additionally, our algorithm avoids
the strong density assumption, which supposes the existence of a density
function for the marginal distribution of the instance space, and is therefore
more generally applicable.
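The abstract does not spell out the algorithm itself, which rests on the customized smoothness assumption described above. As a loose illustration of the general pool-based active-learning loop with a k-NN learner (not the authors' algorithm — the query rule below is a simple uncertainty-sampling heuristic, and all function names are invented for this sketch), one might write:

```python
import numpy as np

def knn_predict_proba(X_train, y_train, X_query, k=5):
    """Fraction of the k nearest labeled neighbors (in 1-D) with label 1."""
    probs = []
    for x in X_query:
        d = np.abs(X_train - x)          # 1-D Euclidean distances
        idx = np.argsort(d)[:k]          # indices of the k nearest labeled points
        probs.append(y_train[idx].mean())
    return np.array(probs)

def active_knn(X_pool, oracle, n_init=5, n_queries=20, k=5, seed=0):
    """Pool-based active learning: start from a few random labels, then
    repeatedly query the point whose k-NN neighborhood is most ambiguous."""
    rng = np.random.default_rng(seed)
    labeled = list(rng.choice(len(X_pool), size=n_init, replace=False))
    y = {i: oracle(X_pool[i]) for i in labeled}
    for _ in range(n_queries):
        unlabeled = [i for i in range(len(X_pool)) if i not in y]
        X_l = X_pool[labeled]
        y_l = np.array([y[i] for i in labeled])
        p = knn_predict_proba(X_l, y_l, X_pool[unlabeled], k)
        # query the unlabeled point whose predicted probability is closest to 1/2
        j = unlabeled[int(np.argmin(np.abs(p - 0.5)))]
        y[j] = oracle(X_pool[j])
        labeled.append(j)
    return labeled, y
```

In this toy loop the label budget is spent near the estimated decision boundary, which is the informal reason active learning can beat the passive rate; the paper's contribution is to make such a gain rigorous under a smoothness assumption tied to the marginal distribution, without assuming that a density exists.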
Updated: 2020-07-14