
Computer Communications

Volume 166, 15 January 2021, Pages 208-215

Edge computing assisted privacy-preserving data computation for IoT devices

https://doi.org/10.1016/j.comcom.2020.11.018

Abstract

Along with the ubiquitous deployment of IoT devices, requirements on sensing data computation and analysis are increasing rapidly. However, the traditional cloud-based architecture can no longer sustain the computation load from these tremendous numbers of IoT devices, which leads to a paradox of delay tolerance and bandwidth insufficiency. Fortunately, edge computing has emerged and been incorporated into the IoT network. Meanwhile, new questions arise: when and how should a device select among edge computing servers, while achieving a good balance between energy consumption, transmission delay, and data privacy? In this paper, we consider the problem of how IoT devices allocate their computation loads between edge computing servers and their on-chip computation units, balancing energy efficiency and data privacy at the physical layer. Firstly, the optimization function of IoT devices is derived, reflecting energy consumption, transmission delay, and the privacy requirement. Secondly, the direct transmission scenario is analyzed, and the optimal transmit power is derived with and without privacy factors. Thirdly, we extend the model to the relay transmission scenario, where edge computing servers are far away, and propose a relay selection algorithm for IoT devices. Finally, extensive simulations verify two main conclusions: energy consumption remains the same under data privacy protection, and using relay IoT devices saves 54.9% energy on average compared to the direct transmission case.

Introduction

The Internet of Things (IoT) is an intelligent network with the ability to generate and process data autonomously through big data analysis or artificial intelligence [1]. Moreover, the fifth generation (5G) cellular network will lay the groundwork for further connecting IoT devices worldwide. Through effective utilization of spectrum bands and intelligent data analysis technologies, the 5G Intelligent Internet of Things (5G I-IoT) can be realized [2]. By 2020, the number of IoT devices in China is expected to reach 24 billion, nearly half of the IoT devices around the world [3]. Obviously, the IoT network ultimately aims to create a closed loop between devices and humans. Firstly, the ubiquitous IoT devices acquire huge amounts of raw data about the environment and the status of the equipment around us. Secondly, the raw data should be transmitted intact to cloud servers or user equipment. Finally, the uploaded data can be processed to fulfill user-centric applications. Moreover, data must be transmitted bidirectionally: IoT devices upload monitoring data to users and receive commands from users as well.

Along with the development of cloud computing, all data gathered at the edge of the network can be uploaded and processed in the cloud. With tremendous computation capability, the cloud can process data in a shorter period than the local computation unit inside an IoT device. As the number of IoT devices, and the data produced by these devices, has soared in recent years, the centralized cloud computing structure has become inefficient and overburdened. With increased computation capability at the edge, a novel solution exists: data need not be uploaded to a central cloud, but can instead be processed in nearby servers. Hence, edge computing technology has emerged and relieved the data uploading requirements dramatically [4], [5], [6], [7]. Conducting most data computation at the edge of the network brings two advantages: firstly, communication overhead and transmission delay [8] can be decreased; secondly, the risk of information leakage decreases, and thus data privacy is protected efficiently [9], [10], [11]. Nevertheless, more attention should be drawn to the privacy of user-sensitive information collected by IoT devices. By maliciously attacking sensitive information while IoT devices upload data to a multi-access edge computing (MEC) server, an adversary could infer personal data statistics using machine learning methods [12].

In [13], the computation offloading problem is analyzed, aiming to minimize energy consumption by selecting between the local unit and the MEC server for each device. The main metrics for a device's computation selection are transmission delay and energy consumption. However, the privacy of transmitted data is ignored. In [14], a multi-user, multi-edge-node computation offloading problem is proposed, aiming to solve the edge computing server selection problem among multiple users distributively. The Nash equilibrium strategies for the best offloading mechanism are derived under both known channel state information (CSI) and unknown CSI scenarios. However, the main metric for performance evaluation is transmission delay, and data privacy is again ignored. In [15], the task offloading problem in ultra-dense networks is studied, aiming at minimizing the processing delay while saving devices' energy. By decomposing it into two sub-problems, an efficient offloading scheme is derived that reduces duration by 20% and achieves 30% energy saving. In [16], a time average computation rate maximization (TACRM) algorithm is proposed, aiming at joint allocation of spectrum and computation resources. Bandwidth, transmission power, time period, and local CPU frequency are considered as variables in the proposed algorithms, again without a data privacy factor.

In our work, we formulate the computing task allocation problem for IoT devices with edge computing servers by trading off three metrics: delay tolerance, energy consumption, and data privacy protection. Each IoT device needs a strategy to decide whether to finish its computation task locally or remotely on an edge computing server, and also to select among edge computing servers. The main contributions are as follows:

  • By incorporating privacy protection for sensing data uploading at the physical layer, we propose a novel and comprehensive computation task allocation model for IoT devices, together with algorithms that achieve a good balance between privacy and energy consumption.

  • The computation task allocation is analyzed under direct transmission and relay transmission respectively. Based on the optimal transmission power, optimal and suboptimal strategies are derived, which may facilitate longer lifetimes and wider deployment of IoT devices.

  • Extensive simulations are conducted to verify the proposed strategies, demonstrating their efficiency and effectiveness.
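To make the three-way trade-off concrete, the following toy sketch scalarizes energy, delay, and privacy risk into a single cost that a device could compare across computation options. The weighted-sum form, the weights, and all numeric values are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical scalarization of the three metrics named above. The
# weighted-sum form and the weights are assumptions for illustration.

def task_cost(energy_j, delay_s, leak_prob, w_e=1.0, w_d=0.5, w_p=2.0):
    """Weighted-sum cost an IoT device could minimize when choosing
    between local computation and each candidate edge server."""
    return w_e * energy_j + w_d * delay_s + w_p * leak_prob

# Compare local execution against offloading to a hypothetical server:
# local costs more energy and time but leaks nothing; offloading is
# cheaper and faster but carries some leakage probability.
local = task_cost(energy_j=0.25, delay_s=2.0, leak_prob=0.0)
edge = task_cost(energy_j=0.08, delay_s=0.5, leak_prob=0.1)
choice = "local" if local < edge else "edge"
print(choice, local, edge)
```

With these particular weights, offloading wins despite the leakage penalty; increasing `w_p` shifts the decision back toward local computation.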

The rest of this paper is organized as follows. In Section 2, we describe the system model and formulate the computation task allocation problem for IoT devices. In Section 3, the direct transmission scenario is analyzed, and we derive the optimal transmission strategy for data uploading to edge computing servers. Furthermore, the relay selection issue, which arises when the edge computing server is far away from the IoT devices, is analyzed in Section 4. In Section 5, the simulation results are provided. Finally, we conclude this work in Section 6.


System model

In this section, we build the system model, which consists of multiple IoT devices and multiple edge computing servers in a given area, and formulate the computation task allocation problem.

Direct transmission scenario

In this section, we begin with the case in which IoT devices transmit their computation load to the edge computing server directly, without any help from other IoT devices. First, we ignore the information leakage factor and derive the optimal computing strategies for the devices. With device $i$'s consumed energy derived in Eq. (6), the simplified problem P2 can be described as
$$\mathrm{P2}:\ \min_{p_i,\,T_i^{\mathrm{local}}}\ T_i^{\mathrm{local}}\,\kappa S_i X (f_i^t)^2 \;+\; T_i^{\mathrm{edge}}\,\frac{p_i S_i}{r(h_{ij}, p_i)}$$
For a given edge computing server $j, j\in J$, and task $S_i$, we derive the
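The structure of P2 can be sketched numerically: a device splits its task between the local CPU and the edge server and picks a transmit power, trading local computing energy against transmission energy. All symbols and numeric values below (κ, X, f, S, bandwidth, noise, channel gain) are assumed placeholders, not the paper's parameters, and the rate model is the standard Shannon formula rather than anything the snippet specifies.

```python
import numpy as np

# Illustrative sketch of problem P2; every constant here is an assumption.
KAPPA = 1e-27       # effective switched capacitance (assumed)
X = 1e3             # CPU cycles per bit (assumed)
F = 5e8             # local CPU frequency f_i in Hz (assumed)
S = 1e6             # task size S_i in bits (assumed)
B = 1e6             # channel bandwidth in Hz (assumed)
N0 = 1e-13          # noise power in W (assumed)
H = 1e-6            # channel gain h_ij (assumed)

def rate(p):
    """Achievable uplink rate r(h, p) via the Shannon formula."""
    return B * np.log2(1.0 + H * p / N0)

def total_energy(t_local, p):
    """Objective of P2: local computing energy plus transmission energy."""
    e_local = t_local * KAPPA * S * X * F**2        # T_local * kappa * S * X * f^2
    e_edge = (1.0 - t_local) * p * S / rate(p)      # T_edge * p * S / r(h, p)
    return e_local + e_edge

# Coarse grid search over the two decision variables (the paper derives the
# optimum analytically; a grid is just the simplest way to visualize it).
ratios = np.linspace(0.0, 1.0, 101)
powers = np.linspace(1e-3, 1.0, 200)                # transmit power in W
grid = np.array([[total_energy(t, p) for p in powers] for t in ratios])
i, j = np.unravel_index(grid.argmin(), grid.shape)
print(f"best local ratio {ratios[i]:.2f}, power {powers[j]:.3f} W, "
      f"energy {grid[i, j]:.3e} J")
```

Under these made-up constants, full offloading at low power dominates; changing the channel gain or CPU frequency shifts the optimal split.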

Relay transmission scenario

Although we have derived the IoT devices' strategies in the direct transmission scenario, IoT devices still need to drop tasks frequently when the distance between the devices and the edge computing servers is long. Hence, relay transmission is a promising approach to minimize the dropping rate and energy consumption. As derived in Eqs. (16) and (18), the energy consumption when using edge computing servers is an increasing function of the distance between the devices and the edge computing server. In this section, we
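The idea of distance-dependent energy motivates a simple relay selection rule, sketched below: the source picks the relay minimizing the total two-hop transmission energy. The path-loss model, its exponent, the fixed transmit power, and the node coordinates are all assumptions for illustration; the paper's actual selection algorithm is not reproduced in this snippet.

```python
import numpy as np

# Hypothetical relay selection sketch; all constants are assumed.
B, N0, P_TX = 1e6, 1e-13, 0.1      # bandwidth, noise power, TX power
PL_EXP = 3.0                        # path-loss exponent (assumed)
S = 1e6                             # bits to forward

def rate(d):
    """Rate over a link of distance d under a simple path-loss model."""
    gain = d ** (-PL_EXP) * 1e-3
    return B * np.log2(1.0 + gain * P_TX / N0)

def hop_energy(d):
    """Energy to send S bits over one hop of distance d (power x time)."""
    return P_TX * S / rate(d)

def select_relay(src, server, relays):
    """Return the relay index minimizing source->relay->server energy."""
    costs = [hop_energy(np.linalg.norm(r - src)) +
             hop_energy(np.linalg.norm(server - r)) for r in relays]
    return int(np.argmin(costs)), costs

src = np.array([10.0, 50.0])
server = np.array([90.0, 50.0])
relays = np.array([[40.0, 50.0], [45.0, 20.0], [50.0, 80.0]])
best, costs = select_relay(src, server, relays)
print(f"selected relay {best} with energy {costs[best]:.3e} J")
```

Since per-hop energy grows with distance under this model, the relay closest to the straight source-server line wins, matching the intuition that relaying shortens each hop.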

Simulation results and analysis

In this section, we conduct extensive simulations to verify the performance of our proposed mechanism. In our simulation setting, all IoT devices and edge computing servers are located in a given area of 100 m × 100 m. This area is divided into three sub-areas: the left 30 m × 100 m strip is reserved for the location of the source IoT device, the middle 30 m × 100 m strip is allocated for relay IoT devices, and the right part is allocated for edge computing servers. Further, we assume that all IoT devices are equipped
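The layout described above can be reproduced with a few lines of setup code. The node counts below are assumptions (the snippet does not state how many relays or servers were simulated); only the strip geometry follows the text.

```python
import numpy as np

# Sketch of the simulation layout: a 100 m x 100 m area split into three
# vertical strips -- source device on the left 30 m, relays in the middle
# 30 m, edge servers in the right 40 m. Node counts are assumed.
rng = np.random.default_rng(0)

def place(n, x_lo, x_hi):
    """Uniformly place n nodes in the strip [x_lo, x_hi] x [0, 100]."""
    return np.column_stack([rng.uniform(x_lo, x_hi, n),
                            rng.uniform(0.0, 100.0, n)])

source = place(1, 0.0, 30.0)        # source IoT device strip
relays = place(10, 30.0, 60.0)      # relay IoT device strip
servers = place(3, 60.0, 100.0)     # edge computing server strip
print(source.shape, relays.shape, servers.shape)
```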

Conclusion

In this work, we analyze the computation task allocation problem for IoT devices with limited energy. Considering constraints such as energy consumption, transmission delay, and data privacy, we formulate task computation allocation as an optimization problem and derive optimal strategies. In the direct transmission scenario, we derive the optimal transmission power for the devices, and then the optimal ratio of the whole task to assign to the local computation unit. In the relay transmission scenario, by

CRediT authorship contribution statement

Gaofei Sun: Methodology, Software, Writing - original draft, Funding acquisition. Xiaoshuang Xing: Writing - review & editing, Validation, Investigation. Zhenjiang Qian: Supervision, Funding acquisition. Wei (Lisa) Li: Writing - review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

The authors would like to acknowledge the support from the Natural Science Foundation of China under grant 61702056; the Natural Science Foundation of Jiangsu Province in China under grant BK20191475; the Qing Lan Project of Jiangsu Province in China under grant 2019; Key Areas Common Technology Tendering Project of Jiangsu Province in 2017 (Research on Information Security Common Technology of Industrial Control System); Science and Technology Project of State Grid Corporation, China in 2019 (Research on

References (16)

