Modernizing k-nearest neighbors
Stat (IF 0.7), Pub Date: 2020-11-23, DOI: 10.1002/sta4.335. Robin Elizabeth Yancey, Bochao Xin, Norm Matloff
The k-nearest neighbors (k-NN) method is one of the oldest statistical/machine learning techniques. It is included in virtually every major package, such as caret, parsnip, mlr3 and scikit-learn. Yet those packages do not go beyond the basics. With today's high-speed computation capability, k-NN can be made much more powerful. Here, we present directions in which that can be done.
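For context, the basic k-NN functionality these packages provide can be sketched with scikit-learn, one of the libraries the abstract names. The data here is synthetic and purely illustrative:

```python
# Minimal k-NN regression sketch using scikit-learn (named in the abstract).
# The dataset is synthetic, generated only to illustrate the basic API.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))                            # 200 points, 2 features
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Fit a plain k-NN regressor with k = 5 neighbors
knn = KNeighborsRegressor(n_neighbors=5)
knn.fit(X, y)

# Predict at a query point: the average response of its 5 nearest neighbors
pred = knn.predict([[0.5, 0.5]])
print(pred)
```

This is the "basics" level the authors refer to: a fixed k, unweighted Euclidean neighborhoods, and a simple neighborhood average, which is the starting point their proposed extensions build on.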