Fourier Growth of Parity Decision Trees
arXiv - CS - Discrete Mathematics. Pub Date: 2021-03-22, arXiv: 2103.11604
Uma Girish, Avishay Tal, Kewen Wu

We prove that for every parity decision tree of depth $d$ on $n$ variables, the sum of absolute values of Fourier coefficients at level $\ell$ is at most $d^{\ell/2} \cdot O(\ell \cdot \log(n))^\ell$. Our result is nearly tight for small values of $\ell$ and extends a previous Fourier bound for standard decision trees by Sherstov, Storozhenko, and Wu (STOC, 2021). As an application of our Fourier bounds, using the results of Bansal and Sinha (STOC, 2021), we show that the $k$-fold Forrelation problem has (randomized) parity decision tree complexity $\tilde{\Omega}\left(n^{1-1/k}\right)$, while having quantum query complexity $\lceil k/2\rceil$. Our proof follows a random-walk approach, analyzing the contribution of a random path in the decision tree to the level-$\ell$ Fourier expression. To carry out the argument, we apply a careful cleanup procedure to the parity decision tree, ensuring that the value of the random walk is bounded with high probability. We observe that the step sizes for the level-$\ell$ walks can be computed from the intermediate values of the level-$\le \ell-1$ walks, which calls for an inductive argument. Our approach differs from previous proofs of Tal (FOCS, 2020) and Sherstov, Storozhenko, and Wu (STOC, 2021) that relied on decompositions of the tree. In particular, for the special case of standard decision trees we view our proof as slightly simpler and more intuitive. In addition, we prove a similar bound for noisy decision trees of cost at most $d$ -- a model that was recently introduced by Ben-David and Blais (FOCS, 2020).
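To make the quantity being bounded concrete: for a parity decision tree computing $f : \{0,1\}^n \to \{-1,1\}$, the level-$\ell$ Fourier growth is $\sum_{|S|=\ell} |\hat{f}(S)|$, where $\hat{f}(S) = \mathbb{E}_x[f(x)(-1)^{\sum_{i \in S} x_i}]$. The brute-force sketch below computes this for a toy depth-2 parity tree. The tree encoding (nested tuples, internal nodes querying the parity of a set of variables) is an illustrative assumption for this sketch, not a construction from the paper.

```python
from itertools import combinations, product

# Toy parity-decision-tree encoding (an assumption for this sketch):
#   leaf:  ('leaf', value)          with value in {+1, -1}
#   node:  ('node', S, left, right) with S a frozenset of variable indices;
#          descend left if the parity of x on S is 0, else right.

def evaluate(tree, x):
    """Evaluate the parity decision tree on input x in {0,1}^n."""
    if tree[0] == 'leaf':
        return tree[1]
    _, S, left, right = tree
    parity = sum(x[i] for i in S) % 2
    return evaluate(left if parity == 0 else right, x)

def level_l1_mass(tree, n, ell):
    """Sum of |f_hat(S)| over all S with |S| = ell, by brute force over 2^n inputs."""
    inputs = list(product([0, 1], repeat=n))
    values = [evaluate(tree, x) for x in inputs]
    total = 0.0
    for S in combinations(range(n), ell):
        # f_hat(S) = E_x[ f(x) * (-1)^{sum_{i in S} x_i} ]
        coeff = sum(v * (-1) ** sum(x[i] for i in S)
                    for v, x in zip(values, inputs)) / len(inputs)
        total += abs(coeff)
    return total

# Depth-2 parity tree on 3 variables: first query x0 xor x1;
# on parity 0 query x2, on parity 1 answer -1 immediately.
tree = ('node', frozenset({0, 1}),
        ('node', frozenset({2}), ('leaf', 1), ('leaf', -1)),
        ('leaf', -1))

print(level_l1_mass(tree, 3, 1))  # prints 0.5
print(level_l1_mass(tree, 3, 2))  # prints 0.5
```

For this tree $f(x) = -\tfrac12 + \tfrac12(-1)^{x_2} + \tfrac12(-1)^{x_0+x_1} + \tfrac12(-1)^{x_0+x_1+x_2}$, so the level-1 and level-2 masses are each $0.5$, well within the theorem's $d^{\ell/2} \cdot O(\ell \log n)^\ell$ bound; the exhaustive computation is only feasible for small $n$ and serves purely to illustrate the quantity.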

Updated: 2021-03-23