Error-Tolerant Computation for Voting Classifiers with Multiple Classes
IEEE Transactions on Vehicular Technology (IF 6.8) Pub Date: 2020-11-01, DOI: 10.1109/tvt.2020.3025739
Shanshan Liu, Pedro Reviriego, Paolo Montuschi, Fabrizio Lombardi

In supervised learning, labeled data are provided as inputs and learning is then used to classify new observations. Error tolerance must be guaranteed when classifiers are employed in critical applications. A widely used class of classifiers is based on voting, either among instances (referred to as single voter classifiers) or among multiple voters (referred to as ensemble classifiers). When such classifiers are implemented on a processor, Time-Based Modular Redundancy (TBMR) techniques are often used for protection due to the inflexibility of the hardware. In TBMR, any single error can be handled at the cost of additional computation: once for detection, or twice for correction after detection; this increases the computation overhead by at least 100%. The Voting Margin (VM) scheme has recently been proposed to reduce the computation overhead of TBMR, but it has only been applied to k Nearest Neighbors (kNN) classifiers with two classes. In this paper, the VM scheme is extended to multiple classes, as well as to other voting classifiers, by exploiting the intrinsic robustness of the algorithms. kNN (a single voter classifier) and Random Forest (RF) (an ensemble classifier) are considered to evaluate the proposed scheme. Using multiple datasets, the results show that the proposed scheme significantly reduces the computation overhead: by more than 70% for kNN with good classification accuracy, and by more than 90% for RF in all cases. However, when extended to multiple classes, the VM scheme for kNN is not efficient for some datasets. A new protection scheme, referred to as k+1 NNs, is therefore presented as an alternative that provides efficient protection in those scenarios. In the new scheme, the computation overhead is further reduced at the cost of allowing a very small percentage of errors to modify the classification outcome.
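The abstract names the techniques without giving algorithmic detail, so the following Python sketch is only an illustration under stated assumptions, not the paper's exact scheme: the function names, the Euclidean distance metric, and the default margin threshold of 2 are all hypothetical. The underlying reasoning is that a single error can move at most one vote from one class to another among the k neighbors, shifting the margin between the winning class and the runner-up by at most two; when the margin exceeds that, the result cannot be flipped and the TBMR re-execution can be skipped.

import numpy as np
from collections import Counter

def knn_vote_counts(X, y, query, k):
    """Per-class vote counts among the k nearest neighbors of `query`."""
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y[i] for i in nearest)

def vm_protected_predict(X, y, query, k, margin_threshold=2):
    """Voting Margin sketch: re-execute (TBMR-style) only when a single
    error could plausibly flip the multi-class vote."""
    votes = knn_vote_counts(X, y, query, k)
    ranked = votes.most_common()
    winner, top = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else 0
    if top - runner_up > margin_threshold:
        # Margin too large for one error to overturn: skip redundancy.
        return winner
    # Small margin: recompute once for detection ...
    second = knn_vote_counts(X, y, query, k).most_common(1)[0][0]
    if second == winner:
        return winner
    # ... and once more for correction on mismatch, as in classic TBMR.
    third = knn_vote_counts(X, y, query, k).most_common(1)[0][0]
    return third

# Toy usage: three Gaussian clusters in 2-D, one query near class 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.5, size=(30, 2)) for c in [(0, 0), (4, 0), (0, 4)]])
y = np.array([0] * 30 + [1] * 30 + [2] * 30)
print(vm_protected_predict(X, y, np.array([3.8, 0.2]), k=5))  # -> 1

Under this assumed error model, only queries whose margin falls at or below the threshold pay for re-execution, which is intuitively why skipping the redundant runs for large-margin queries can cut the computation overhead substantially.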

Updated: 2020-11-01