Prudence when assuming normality: An advice for machine learning practitioners
Pattern Recognition Letters ( IF 5.1 ) Pub Date : 2020-06-30 , DOI: 10.1016/j.patrec.2020.06.026
Waleed A. Yousef

In a binary classification problem, the feature vector (predictor) is the input to a scoring function that produces a decision value (score), which is compared to a chosen threshold to yield the final class prediction (output). Although the normality assumption on the scoring function is important in many applications, it is sometimes severely violated even under the simple multinormal assumption on the feature vector. This article proves this result mathematically with a counterexample, advising practitioners to avoid blind assumptions of normality. On the other hand, the article provides a set of experiments that illustrate some of the expected, well-behaved results of the Area Under the ROC Curve (AUC) under the multinormal assumption on the feature vector. The message of the article, therefore, is not to avoid the normality assumption on either the input feature vector or the output scoring function, but that prudence is needed when adopting either of them.
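A minimal sketch of the phenomenon the abstract describes (this is an illustrative example, not the paper's actual counterexample): when the feature vector is multivariate normal, a linear score stays normal and the binormal AUC formula applies, while a nonlinear (here quadratic) score can be decidedly non-normal. The class means, covariance, and score functions below are assumptions chosen for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two classes, each with a bivariate-normal feature vector (illustrative parameters).
n = 5000
cov = [[1.0, 0.5], [0.5, 1.0]]
x0 = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # negative class
x1 = rng.multivariate_normal([1.0, 1.0], cov, size=n)  # positive class

# Linear score: a linear map of a multinormal vector is normal, so the
# binormal model holds and AUC has the closed form Phi(delta / sqrt(v0 + v1)).
w = np.array([1.0, 1.0])
s0_lin, s1_lin = x0 @ w, x1 @ w
delta = s1_lin.mean() - s0_lin.mean()
auc_binormal = stats.norm.cdf(delta / np.sqrt(s0_lin.var() + s1_lin.var()))

# Empirical AUC via the Mann-Whitney statistic; agrees closely with the formula.
auc_empirical = (s1_lin[:, None] > s0_lin[None, :]).mean()

# Quadratic score: a chi-square-like quantity, whose normality is rejected
# by the D'Agostino-Pearson test despite the multinormal input.
s0_quad = (x0 ** 2).sum(axis=1)
pvalue = stats.normaltest(s0_quad).pvalue
```

With these parameters the linear score's binormal AUC is about Φ(2/√6) ≈ 0.79 and matches the empirical Mann-Whitney estimate, while the quadratic score's normality test p-value is essentially zero, which is the kind of severe violation the article warns about.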




Updated: 2020-07-06