Boosting over non-deterministic ZDDs
Theoretical Computer Science (IF 0.9), Pub Date: 2018-12-04, DOI: 10.1016/j.tcs.2018.11.027
Takahiro Fujita, Kohei Hatano, Eiji Takimoto

We propose a new approach to large-scale machine learning: learning over compressed data. The idea is to first compress the training data in some way and then run machine learning algorithms directly on the compressed representation, in the hope that computation time is significantly reduced when the training data compresses well. As a first step toward this approach, we consider a variant of the Zero-Suppressed Binary Decision Diagram (ZDD) as the data structure representing the training data; the variant generalizes the ZDD by incorporating non-determinism. As the learning algorithms to be employed, we consider the boosting algorithm AdaBoost⁎ and its precursor AdaBoost. In this paper, we give efficient implementations of these boosting algorithms whose running time per iteration is linear in the size of the given ZDD.
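For reference, the weight-update loop of standard AdaBoost, which the paper's ZDD-based implementations accelerate, can be sketched as follows. This is a minimal illustrative sketch of plain AdaBoost over an explicit example list, not the authors' compressed-data algorithm; the function and variable names are hypothetical.

```python
import math

def adaboost(examples, labels, weak_learners, rounds):
    """Plain AdaBoost sketch (not the paper's ZDD-based version).
    examples: list of inputs; labels: list of +1/-1;
    weak_learners: list of callables h(x) -> +1/-1."""
    n = len(examples)
    weights = [1.0 / n] * n          # uniform initial distribution
    ensemble = []                    # list of (alpha, hypothesis) pairs
    for _ in range(rounds):
        # pick the weak hypothesis with the smallest weighted error
        best_h, best_err = None, float("inf")
        for h in weak_learners:
            err = sum(w for w, x, y in zip(weights, examples, labels)
                      if h(x) != y)
            if err < best_err:
                best_h, best_err = h, err
        if best_err >= 0.5:          # no better than random guessing: stop
            break
        alpha = 0.5 * math.log((1 - best_err) / max(best_err, 1e-12))
        # reweight: misclassified examples gain weight, correct ones lose it
        weights = [w * math.exp(-alpha * y * best_h(x))
                   for w, x, y in zip(weights, examples, labels)]
        z = sum(weights)             # normalization factor
        weights = [w / z for w in weights]
        ensemble.append((alpha, best_h))
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of the chosen weak hypotheses."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

Each iteration above scans the examples explicitly, so its cost is linear in the number of examples; the paper's contribution is to perform the analogous per-iteration work in time linear in the size of the (possibly much smaller) ZDD representing those examples.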




Updated: 2018-12-04