Vertical federated DNN training
Physical Communication (IF 2.2), Pub Date: 2021-09-17, DOI: 10.1016/j.phycom.2021.101465
Mingjun Dai, Annan Xu, Qingwen Huang, Zhonghao Zhang, Xiaohui Lin

In distributed machine learning, the data held by different companies usually contain different features, and some companies may even lack labels. One option is therefore for multiple companies to train jointly in the form of federated learning (FL). Existing studies on FL are confined to traditional machine learning (ML) models, including linear regression, logistic regression, and decision trees. In this work, vertical federated learning is extended from traditional ML to deep neural networks (DNNs). A privacy-preserving DNN training scheme based on homomorphic encryption is proposed for vertically partitioned datasets. It is shown that the proposed federated DNN scheme can achieve precision similar to that of a traditional centralized DNN scheme.
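To illustrate the setting, the sketch below shows a minimal two-party vertical (split) DNN in PyTorch: each party applies a local bottom network to its own feature columns, and the label-holding party fuses the resulting embeddings in a top model. This is only an assumed, simplified illustration of vertical federated DNN training in general; the paper's actual protocol additionally protects the exchanged intermediate values with homomorphic encryption, which is omitted here, and all layer sizes and data are made up for demonstration.

```python
import torch
import torch.nn as nn

# Illustrative sketch only, not the paper's exact protocol: two parties hold
# different feature columns of the same samples; only Party B holds the labels.
torch.manual_seed(0)

n_samples, d_a, d_b = 256, 5, 3                      # assumed feature split sizes
x_a = torch.randn(n_samples, d_a)                    # Party A's features
x_b = torch.randn(n_samples, d_b)                    # Party B's features
y = torch.randint(0, 2, (n_samples, 1)).float()      # labels, held by Party B only

# Each party trains a local "bottom" sub-network on its own features;
# the label holder also owns the "top" model that fuses both embeddings.
bottom_a = nn.Sequential(nn.Linear(d_a, 8), nn.ReLU())
bottom_b = nn.Sequential(nn.Linear(d_b, 8), nn.ReLU())
top = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 1))

opt = torch.optim.Adam(
    list(bottom_a.parameters()) + list(bottom_b.parameters()) + list(top.parameters()),
    lr=1e-2,
)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(50):
    # Each party computes its embedding locally; only embeddings (in the paper,
    # encrypted values) cross the party boundary -- raw features never leave their owner.
    h_a = bottom_a(x_a)
    h_b = bottom_b(x_b)
    logits = top(torch.cat([h_a, h_b], dim=1))
    loss = loss_fn(logits, y)

    opt.zero_grad()
    loss.backward()   # gradients w.r.t. h_a are what Party B would send back to Party A
    opt.step()

print(f"final training loss: {loss.item():.4f}")
```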




Updated: 2021-10-01