Abstract
Graph Neural Networks (GNNs) have achieved remarkable performance by taking advantage of graph data. The success of GNN models depends heavily on rich features and adjacency relationships. In practice, however, such data are usually isolated by different data owners (clients) and are therefore likely to be Non-Independent and Identically Distributed (Non-IID). Meanwhile, given the limited network conditions of data owners, hyper-parameter optimization for collaborative learning approaches is time-consuming in data-isolation scenarios. To address these problems, we propose an Automated Separated-Federated Graph Neural Network (ASFGNN) learning paradigm. ASFGNN consists of two main components: the training of the GNN and the tuning of hyper-parameters. Specifically, to address the Non-IID data problem, we first propose a separated-federated GNN learning model, which decouples GNN training into two parts: the message-passing part, performed by each client separately, and the loss-computing part, learned by the clients federally. To handle the time-consuming parameter-tuning problem, we leverage the Bayesian optimization technique to automatically tune the hyper-parameters of all the clients. We conduct experiments on benchmark datasets, and the results demonstrate that ASFGNN significantly outperforms the naive federated GNN in terms of both accuracy and parameter-tuning efficiency.
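The separated-federated idea in the abstract can be sketched as follows: each client keeps its message-passing (embedding) parameters strictly local, while only the loss-computing (discriminative) parameters are aggregated across clients in a FedAvg-style, sample-size-weighted average. This is a minimal illustrative sketch, not the authors' implementation; all class and variable names (`Client`, `local_mp`, `shared`, `fed_avg`) are assumptions, and the gradient steps are stand-in noise updates.

```python
# Minimal sketch of separated-federated training (assumed names, toy updates).
import numpy as np

rng = np.random.default_rng(0)


class Client:
    def __init__(self, dim):
        self.local_mp = rng.normal(size=(dim, dim))  # message-passing weights: never shared
        self.shared = rng.normal(size=(dim,))        # loss-computing weights: federated
        self.n_samples = int(rng.integers(50, 200))  # local data size, used as averaging weight

    def local_update(self, global_shared):
        # Each round starts from the aggregated discriminative parameters.
        self.shared = global_shared.copy()
        # Placeholder for local gradient steps on both parameter groups.
        self.local_mp -= 0.01 * rng.normal(size=self.local_mp.shape)
        self.shared -= 0.01 * rng.normal(size=self.shared.shape)
        return self.shared, self.n_samples


def fed_avg(updates):
    # Sample-size weighted average of the shared parameters only.
    total = sum(n for _, n in updates)
    return sum(w * (n / total) for w, n in updates)


dim = 4
clients = [Client(dim) for _ in range(3)]
global_shared = np.zeros(dim)
for _ in range(5):  # federated rounds
    updates = [c.local_update(global_shared) for c in clients]
    global_shared = fed_avg(updates)

print(global_shared.shape)  # (4,)
```

Note that only the vector `shared` ever leaves a client; each `local_mp` matrix stays private, which is what lets clients with Non-IID graphs keep client-specific message-passing behavior. The paper's second component, Bayesian optimization of per-client hyper-parameters, would wrap a loop like this inside an outer search over learning rates and model sizes.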
This article belongs to the Topical Collection: Special Issue on Privacy-Preserving Computing
Guest Editors: Kaiping Xue, Zhe Liu, Haojin Zhu, Miao Pan and David S.L. Wei
Cite this article
Zheng, L., Zhou, J., Chen, C. et al. ASFGNN: Automated separated-federated graph neural network. Peer-to-Peer Netw. Appl. 14, 1692–1704 (2021). https://doi.org/10.1007/s12083-021-01074-w