Machine Learning (IF 2.809) Pub Date: 2019-06-18, DOI: 10.1007/s10994-019-05814-1
Eric Bax, Lingjie Weng, Xu Tian

Abstract We introduce the speculate-correct method to derive error bounds for local classifiers. Using it, we show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with $$\mathrm{O}\left( \sqrt{(k + \ln n)/n} \right)$$ range around an estimate of generalization error for n in-sample examples.
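To illustrate the rate stated in the abstract, the sketch below evaluates $\sqrt{(k + \ln n)/n}$ for a few sample sizes, showing how the bound range shrinks as n grows for fixed k. The scaling constant `c` is a hypothetical placeholder, not a constant from the paper; this is only a numeric illustration of the asymptotic rate, not the paper's actual bound derivation.

```python
import math

def bound_range_rate(k, n, c=1.0):
    """Rate of the error-bound range, O(sqrt((k + ln n)/n)).

    c is a hypothetical scaling constant for illustration only;
    the paper's actual constants are not reproduced here.
    """
    return c * math.sqrt((k + math.log(n)) / n)

# For fixed k, the range narrows roughly like 1/sqrt(n):
for n in (10**3, 10**4, 10**5):
    print(f"n={n:>6}  rate={bound_range_rate(k=5, n=n):.4f}")
```

For example, with k = 5 the rate at n = 10^4 is roughly a third of the rate at n = 10^3, reflecting the dominant $1/\sqrt{n}$ behavior once $\ln n$ grows slowly relative to n.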
