Breaking the Chain of Gradient Leakage in Vision Transformers
arXiv - CS - Computer Vision and Pattern Recognition. Pub Date: 2022-05-25, DOI: arxiv-2205.12551
Yahui Liu, Bin Ren, Yue Song, Wei Bi, Nicu Sebe, Wei Wang

User privacy is of great concern in Federated Learning, and Vision Transformers (ViTs) have been shown to be vulnerable to gradient-based inversion attacks. We show that the learned low-dimensional spatial prior in position embeddings (PEs) accelerates the training of ViTs. As a side effect, it makes ViTs position-sensitive and thus at high risk of privacy leakage. We observe that enhancing the position-insensitivity of a ViT model is a promising way to protect data privacy against these gradient attacks. However, simply removing the PEs may not only harm the convergence and accuracy of ViTs but also place the model at more severe privacy risk. To resolve this contradiction, we propose a simple yet efficient Masked Jigsaw Puzzle (MJP) method to break the chain of gradient leakage in ViTs. MJP can be easily plugged into existing ViTs and their derived variants. Extensive experiments demonstrate that the proposed MJP method not only boosts performance on large-scale datasets (i.e., ImageNet-1K), but also improves privacy preservation against typical gradient attacks by a large margin. Our code is available at: https://github.com/yhlleo/MJP.
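The abstract does not spell out the mechanics of MJP; a rough reading is that a random subset of patch tokens is shuffled jigsaw-style so that token content is decoupled from the positional slots it occupies, which pushes the model toward position insensitivity. The sketch below illustrates that idea in PyTorch under this assumption; the function name, the mask ratio, and the per-sample shuffling scheme are illustrative choices, not the authors' exact recipe (see the official code at https://github.com/yhlleo/MJP for the actual method).

```python
# Minimal sketch of a jigsaw-style patch shuffling step, assuming MJP's core
# operation resembles permuting a randomly selected subset of patch tokens
# before position embeddings are added. Mask ratio and shuffling details are
# illustrative assumptions, not the paper's exact procedure.
import torch

def masked_jigsaw_puzzle(patch_tokens, pos_embed, mask_ratio=0.25):
    """Shuffle a random subset of patch tokens so their content no longer
    matches their positional slots.

    patch_tokens: (B, N, D) patch embeddings (class token excluded)
    pos_embed:    (1, N, D) learned position embeddings
    """
    B, N, D = patch_tokens.shape
    num_masked = max(1, int(N * mask_ratio))

    shuffled = patch_tokens.clone()
    for b in range(B):
        # Pick which patch positions take part in the jigsaw puzzle.
        idx = torch.randperm(N)[:num_masked]
        # Permute the selected tokens among themselves.
        perm = idx[torch.randperm(num_masked)]
        shuffled[b, idx] = patch_tokens[b, perm]

    # Position embeddings are added as usual; the shuffled tokens now carry
    # mismatched positional information, encouraging position insensitivity.
    return shuffled + pos_embed

# Usage with dummy tensors (batch of 2, 196 patches, 768-dim embeddings):
tokens = torch.randn(2, 196, 768)
pos = torch.randn(1, 196, 768)
out = masked_jigsaw_puzzle(tokens, pos)
```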

Updated: 2022-05-26