Improved Quantum Boosting
arXiv - CS - Computational Complexity Pub Date : 2020-09-17 , DOI: arxiv-2009.08360 Adam Izdebski and Ronald de Wolf
Boosting is a general method to convert a weak learner (which generates
hypotheses that are just slightly better than random) into a strong learner
(which generates hypotheses that are much better than random). Recently,
Arunachalam and Maity gave the first quantum improvement for boosting, by
combining Freund and Schapire's AdaBoost algorithm with a quantum algorithm for
approximate counting. Their booster is faster than classical boosting as a
function of the VC-dimension of the weak learner's hypothesis class, but worse
as a function of the quality of the weak learner. In this paper we give a
substantially faster and simpler quantum boosting algorithm, based on
Servedio's SmoothBoost algorithm.
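To make the boosting loop the abstract refers to concrete, here is a minimal classical AdaBoost sketch (Freund and Schapire's algorithm, the starting point of Arunachalam and Maity's quantum booster). It is an illustrative toy implementation, not the paper's quantum algorithm: the weak learner is an exhaustive search over axis-aligned decision stumps, and the dataset, thresholds, and round count are chosen only for demonstration.

```python
import numpy as np

def best_stump(X, y, w):
    """Weak learner: pick the decision stump (feature, threshold, sign)
    with the lowest weighted error under distribution w."""
    best_params, best_pred, best_err = None, None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] >= thr, sign, -sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best_params, best_pred = err, (j, thr, sign), pred
    return best_params, best_pred

def adaboost(X, y, T=10):
    """Run T rounds of AdaBoost; labels y must be in {-1, +1}.
    Returns a list of (alpha, stump_params) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # distribution D_t over examples
    ensemble = []
    for _ in range(T):
        params, pred = best_stump(X, y, w)
        eps = w[pred != y].sum()            # weighted error of the weak hypothesis
        eps = min(max(eps, 1e-12), 1 - 1e-12)
        alpha = 0.5 * np.log((1 - eps) / eps)
        w *= np.exp(-alpha * y * pred)      # upweight mistakes, downweight hits
        w /= w.sum()                        # renormalize to a distribution
        ensemble.append((alpha, params))
    return ensemble

def predict(ensemble, X):
    """Strong hypothesis: sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, (j, thr, sign) in ensemble:
        score += alpha * np.where(X[:, j] >= thr, sign, -sign)
    return np.sign(score)
```

For example, labeling the interval (0.3, 0.7) as +1 on a 1-D toy set: no single stump classifies it perfectly, but three boosting rounds already drive the training error to zero, which is exactly the weak-to-strong conversion the abstract describes.

```python
X = np.array([[0.1], [0.2], [0.4], [0.5], [0.6], [0.8], [0.9]])
y = np.array([-1, -1, +1, +1, +1, -1, -1])
ensemble = adaboost(X, y, T=3)
assert (predict(ensemble, X) == y).all()
```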
Updated: 2020-09-18