ReLU Regression with Massart Noise
arXiv - CS - Data Structures and Algorithms Pub Date : 2021-09-10 , DOI: arxiv-2109.04623
Ilias Diakonikolas, Jongho Park, Christos Tzamos

We study the fundamental problem of ReLU regression, where the goal is to fit Rectified Linear Units (ReLUs) to data. This supervised learning task is efficiently solvable in the realizable setting, but is known to be computationally hard with adversarial label noise. In this work, we focus on ReLU regression in the Massart noise model, a natural and well-studied semi-random noise model. In this model, the label of every point is generated according to a function in the class, but an adversary is allowed to change this value arbitrarily with some probability, which is at most $\eta < 1/2$. We develop an efficient algorithm that achieves exact parameter recovery in this model under mild anti-concentration assumptions on the underlying distribution. Such assumptions are necessary for exact recovery to be information-theoretically possible. We demonstrate that our algorithm significantly outperforms naive applications of $\ell_1$ and $\ell_2$ regression on both synthetic and real data.
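To make the setting concrete, the Massart data-generating process described above can be sketched in a few lines. This is only an illustration of the noise model, not the paper's algorithm; the dimension, sample size, noise rate, and adversarial corruption used here are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for illustration: dimension d, sample count n,
# and Massart noise rate eta < 1/2.
d, n, eta = 5, 10_000, 0.3

w_true = rng.normal(size=d)          # hidden parameter vector
X = rng.normal(size=(n, d))          # covariates (standard Gaussian here)

# Realizable labels: each label is ReLU(<w_true, x>).
clean = np.maximum(X @ w_true, 0.0)

# Massart adversary: each label is corrupted independently with
# probability at most eta (here, exactly eta), and the corrupted value
# may be arbitrary; a shifted value stands in for the adversary's choice.
flip = rng.random(n) < eta
y = np.where(flip, clean + rng.normal(scale=5.0, size=n), clean)

print(f"corrupted fraction: {flip.mean():.3f}")
```

Note that plain $\ell_2$ regression on `(X, y)` is pulled toward the corrupted labels, which is the failure mode the paper's algorithm is designed to avoid.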

Updated: 2021-09-13