Private and Communication-Efficient Edge Learning: A Sparse Differential Gaussian-Masking Distributed SGD Approach
arXiv - CS - Networking and Internet Architecture. Pub Date: 2020-01-12, DOI: arxiv-2001.03836
Xin Zhang, Minghong Fang, Jia Liu, and Zhengyuan Zhu

With the rise of machine learning (ML) and the proliferation of smart mobile devices, recent years have witnessed a surge of interest in performing ML in wireless edge networks. In this paper, we consider the problem of jointly improving data privacy and communication efficiency in distributed edge learning, both of which are critical performance metrics in wireless edge network computing. Toward this end, we propose a new decentralized stochastic gradient method with sparse differential Gaussian-masked stochastic gradients (SDM-DSGD) for non-convex distributed edge learning. Our main contributions are three-fold: i) we theoretically establish the privacy and communication-efficiency guarantees of our SDM-DSGD method, which outperforms all existing works; ii) we show that SDM-DSGD improves the fundamental training-privacy trade-off by two orders of magnitude compared with the state of the art; iii) we reveal theoretical insights and offer practical design guidelines for the interaction between privacy preservation and communication efficiency, two conflicting performance goals. We conduct extensive experiments with a variety of learning models on the MNIST and CIFAR-10 datasets to verify our theoretical findings. Collectively, our results contribute to the theory and algorithm design for distributed edge learning.
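The abstract names three ingredients in the method's name: a differential (transmit only the change since the last shared model), sparsification (send only a few coordinates), and Gaussian masking (add noise to what is sent, for privacy). The sketch below illustrates how one local update step could combine these pieces; it is a minimal reading of the acronym, not the authors' actual algorithm. The function name, the top-k sparsifier, the noise scale sigma, the mixing-weight vector W, and the order of the steps are all assumptions made for illustration.

```python
# Hypothetical sketch of one sparse differential Gaussian-masked update on a
# single node of a decentralized SGD run. Assumptions (not from the paper):
# top-k sparsification, i.i.d. Gaussian noise on transmitted coordinates,
# gossip averaging with a doubly-stochastic mixing matrix row W.
import numpy as np

def sdm_dsgd_step(x, x_last_sent, grad, W, neighbors_x,
                  lr=0.1, k=10, sigma=0.01, rng=None):
    """x            : current local model (1-D array)
       x_last_sent  : copy of the model this node last shared with neighbors
       grad         : local stochastic gradient evaluated at x
       W            : mixing weights for this node (self weight first)
       neighbors_x  : model copies most recently received from neighbors"""
    rng = np.random.default_rng() if rng is None else rng

    # Differential: encode only the change relative to the last transmission.
    diff = (x - lr * grad) - x_last_sent

    # Sparsification: keep the k largest-magnitude coordinates of the change.
    idx = np.argsort(np.abs(diff))[-k:]
    sparse_diff = np.zeros_like(diff)
    sparse_diff[idx] = diff[idx]

    # Gaussian masking: perturb the transmitted coordinates for privacy.
    sparse_diff[idx] += rng.normal(0.0, sigma, size=k)

    # The (sparse, noisy) message that would actually be sent to neighbors.
    message = x_last_sent + sparse_diff

    # Gossip averaging with the latest copies received from neighbors.
    mixed = W[0] * message + sum(w * xn for w, xn in zip(W[1:], neighbors_x))
    return mixed, message  # new local model, new reference copy to remember
```

Under these assumptions, the trade-off the abstract refers to is visible directly in the two knobs: a larger noise scale sigma strengthens privacy but slows training, while a smaller k reduces communication but discards more of the gradient signal.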

Updated: 2020-03-31