
A Computing Offloading Resource Allocation Scheme Using Deep Reinforcement Learning in Mobile Edge Computing Systems

Journal of Grid Computing (2021)

Abstract

To address the increased latency, higher energy consumption, and degraded quality of service in current vehicular networks, this paper proposes a computation offloading and resource allocation strategy based on deep reinforcement learning for the Internet of Vehicles. First, a system architecture for the Internet of Vehicles is designed, and the computation and communication models of the offloading strategy are constructed. Next, the resource allocation problem during offloading is studied for a real-time, energy-aware offloading scheme in mobile edge computing. In addition, taking the limited battery capacity of vehicle users into account, the remaining-energy ratio is used to redefine the weighting factors so that energy consumption is sensed in real time. Finally, with minimum delay and minimum computational cost as the optimization objectives, Q-learning is applied to optimize the offloading strategy, that is, to allocate communication and computing resources optimally while preserving system security. Simulation results show that the proposed algorithm achieves a delay of 0.442 s at a computational complexity of 9000 cycles/byte, an improvement in delay over the other three algorithms.
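To make the offloading decision loop concrete, below is a minimal sketch of an energy-aware Q-learning agent in the spirit of the scheme summarized above. It is not the paper's implementation: the state encoding, the three-way action set (local / MEC server / cloud), the rule that grows the energy weight as the battery drains, and all numeric parameters are illustrative assumptions.

```python
import random
from collections import defaultdict

# Illustrative parameters (not taken from the paper).
N_ACTIONS = 3                           # 0 = local execution, 1 = MEC server, 2 = remote cloud
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount factor, exploration rate


def weighted_cost(delay, energy, remaining_energy_ratio):
    """Energy-aware cost under an assumed weighting rule: the weight on energy
    grows as the battery drains (remaining_energy_ratio assumed in [0, 1])."""
    w_energy = 1.0 - remaining_energy_ratio
    w_delay = 1.0 - w_energy
    return w_delay * delay + w_energy * energy


def choose_action(q_table, state):
    """Epsilon-greedy selection over offloading decisions."""
    if random.random() < EPSILON:
        return random.randrange(N_ACTIONS)
    q_row = q_table[state]
    return max(range(N_ACTIONS), key=lambda a: q_row[a])


def q_update(q_table, state, action, cost, next_state):
    """Standard Q-learning update; the reward is the negative weighted cost."""
    reward = -cost
    best_next = max(q_table[next_state])
    q_table[state][action] += ALPHA * (reward + GAMMA * best_next - q_table[state][action])


# Minimal training loop over simulated task arrivals; the environment feedback
# (delay in seconds, energy in joules, battery level) is a random placeholder.
q_table = defaultdict(lambda: [0.0] * N_ACTIONS)
state = ("low_load", "high_battery")
for step in range(1000):
    action = choose_action(q_table, state)
    delay = random.uniform(0.2, 1.0)
    energy = random.uniform(0.1, 0.5)
    battery = random.uniform(0.0, 1.0)
    cost = weighted_cost(delay, energy, battery)
    next_state = ("high_load" if delay > 0.6 else "low_load",
                  "low_battery" if battery < 0.3 else "high_battery")
    q_update(q_table, state, action, cost, next_state)
    state = next_state
```

The tabular agent with a discretized state mirrors the paper's use of Q-learning for the offloading decision; a deep variant would replace the table with a neural network approximator, but the paper's exact state space, reward shaping, and network architecture are not reproduced here.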

Acknowledgments

This work was supported by the 2019 Domestic Visiting Training Project for Outstanding Young Backbone Teachers in Colleges and Universities in Anhui Province (No. gxgnfx2019050) and the Non-financial Research Projects of Suzhou University, Anhui Province (No. 2020xhx094).

Data Availability Statement

The data included in this paper are available without any restriction.

Author information

Corresponding author

Correspondence to Xuezhu Li.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Li, X. A Computing Offloading Resource Allocation Scheme Using Deep Reinforcement Learning in Mobile Edge Computing Systems. J Grid Computing 19, 35 (2021). https://doi.org/10.1007/s10723-021-09568-w
