Statistical Query Lower Bounds for List-Decodable Linear Regression
arXiv - CS - Data Structures and Algorithms. Pub Date: 2021-06-17. DOI: arXiv:2106.09689
Ilias Diakonikolas, Daniel M. Kane, Ankit Pensia, Thanasis Pittas, Alistair Stewart

We study the problem of list-decodable linear regression, where an adversary can corrupt a majority of the examples. Specifically, we are given a set $T$ of labeled examples $(x, y) \in \mathbb{R}^d \times \mathbb{R}$ and a parameter $0< \alpha <1/2$ such that an $\alpha$-fraction of the points in $T$ are i.i.d. samples from a linear regression model with Gaussian covariates, and the remaining $(1-\alpha)$-fraction of the points are drawn from an arbitrary noise distribution. The goal is to output a small list of hypothesis vectors such that at least one of them is close to the target regression vector. Our main result is a Statistical Query (SQ) lower bound of $d^{\mathrm{poly}(1/\alpha)}$ for this problem. Our SQ lower bound qualitatively matches the performance of previously developed algorithms, providing evidence that current upper bounds for this task are nearly best possible.
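To make the data model in the abstract concrete, the following is a minimal illustrative sketch (not from the paper) of how a corrupted sample set $T$ could be simulated: an $\alpha$-fraction of points follows a linear model with Gaussian covariates, and the remaining $(1-\alpha)$-fraction is arbitrary noise. The dimension, sample size, noise levels, and the placeholder candidate list are assumptions chosen for the demo; the paper itself proves a lower bound against SQ algorithms rather than proposing an estimator.

```python
import numpy as np

# Illustrative simulation of the list-decodable linear regression data model.
# All concrete values (d, n, alpha, noise scales) are assumptions for the demo.
rng = np.random.default_rng(0)

d, n, alpha = 10, 5000, 0.1            # dimension, sample size, inlier fraction
beta = rng.normal(size=d)              # unknown target regression vector
beta /= np.linalg.norm(beta)

n_in = int(alpha * n)                  # alpha-fraction of clean samples
X_in = rng.normal(size=(n_in, d))      # Gaussian covariates
y_in = X_in @ beta + 0.1 * rng.normal(size=n_in)   # linear responses + small noise

n_out = n - n_in                       # remaining (1 - alpha)-fraction is adversarial;
X_out = rng.normal(size=(n_out, d))    # here modeled as labels drawn from an
y_out = rng.normal(size=n_out)         # arbitrary noise distribution

X = np.vstack([X_in, X_out])           # the learner only sees the mixed set T
y = np.concatenate([y_in, y_out])

# List-decoding success criterion: output a short list of hypotheses such that
# at least one is close to beta. The random list below is only a placeholder.
candidate_list = [rng.normal(size=d) for _ in range(int(1 / alpha))]
best_error = min(np.linalg.norm(b - beta) for b in candidate_list)
print(f"closest hypothesis in the list is at distance {best_error:.3f} from beta")
```

A list of size roughly $O(1/\alpha)$ is the natural target here, since with only an $\alpha$-fraction of inliers a single output vector cannot be guaranteed to be close to $\beta$.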

Updated: 2021-06-18