MECC: A Mobile Edge Collaborative Caching Framework Empowered by Deep Reinforcement Learning
IEEE Network (IF 6.8) | Pub Date: 2021-08-20 | DOI: 10.1109/mnet.011.2000663
Siya Xu, Xin Liu, Shaoyong Guo, Xuesong Qiu, Luoming Meng

With the rapid development of smart cities and 5G, user demand for Internet services has grown exponentially. Through collaborative content sharing, the storage limitation of a single edge server (ES) can be overcome. However, when mobile users need to download whole content items while passing through multiple regions, independently deciding the caching content for ESs in different regions may result in redundant caching. Furthermore, frequent switching of communication connections during user movement also causes retransmission delay. As a revolutionary approach in the artificial intelligence field, deep reinforcement learning (DRL) has achieved great success in solving high-dimensional problems related to network resource management. Therefore, we integrate collaborative caching and DRL to build an intelligent edge caching framework that realizes collaborative caching between the cloud and ESs. In this caching framework, a federated-machine-learning-based user behavior prediction model is first designed to characterize the content preferences and movement trajectories of mobile users. Next, to achieve efficient resource aggregation of ESs, a user-behavior-aware dynamic collaborative caching domain (DCCD) construction and management mechanism is devised to divide ESs into clusters, select cluster heads, and set the re-clustering rules. Then, a DRL-based content caching and delivery algorithm is presented to decide the caching content of ESs in a DCCD from a global perspective and to plan the transmission path for users, which reduces redundant content and transmission delay. In particular, when a user request cannot be satisfied by the current DCCD, a cross-domain content delivery strategy is presented to allow ESs in other DCCDs to provide and forward content to the user, avoiding the traffic pressure and delay caused by requesting services from the cloud. The simulation results show that the proposed collaborative caching framework can improve user satisfaction in terms of content hit rate and...
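The core decision step described above (choosing, from a global view, which content items the ESs in a DCCD should cache so that hit rate rises and transmission delay falls) can be illustrated with a toy reinforcement-learning example. The paper's actual state, action, and reward design is not given in this abstract, so the sketch below is only a minimal, assumption-laden stand-in: plain tabular Q-learning over a tiny content catalogue, where the state is the set of items currently cached in the domain, the action is the next item to cache, and the reward is a cache hit (+1) or a miss served from the cloud (-1). The catalogue size, popularity distribution, eviction rule, and reward values are all invented for the example.

# Illustrative sketch only (NOT the authors' MECC algorithm): tabular Q-learning
# that learns which content items a small edge-server cluster (a "DCCD") should
# cache. Catalogue size, popularity, eviction rule and rewards are assumptions.
import itertools
import random
import numpy as np

N_CONTENTS = 4                                  # size of the content catalogue (assumed)
CACHE_SLOTS = 2                                 # total items the DCCD can hold (assumed)
POPULARITY = np.array([0.4, 0.3, 0.2, 0.1])     # assumed request distribution

# Enumerate every possible cache state (any subset of the catalogue up to CACHE_SLOTS).
STATES = [frozenset(c) for k in range(CACHE_SLOTS + 1)
          for c in itertools.combinations(range(N_CONTENTS), k)]
STATE_ID = {s: i for i, s in enumerate(STATES)}
Q = np.zeros((len(STATES), N_CONTENTS))         # Q[state, "cache item a"]

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1           # learning rate, discount, exploration

def step(cache, action):
    """Apply one caching decision, then sample one user request as the reward signal."""
    new_cache = set(cache)
    if action not in new_cache:
        if len(new_cache) >= CACHE_SLOTS:
            new_cache.remove(random.choice(sorted(new_cache)))   # naive random eviction
        new_cache.add(action)
    request = np.random.choice(N_CONTENTS, p=POPULARITY)
    reward = 1.0 if request in new_cache else -1.0               # hit vs. fetch from cloud
    return frozenset(new_cache), reward

cache = frozenset()
for _ in range(20000):                          # training steps
    s = STATE_ID[cache]
    a = random.randrange(N_CONTENTS) if random.random() < EPSILON else int(np.argmax(Q[s]))
    cache, r = step(cache, a)
    Q[s, a] += ALPHA * (r + GAMMA * Q[STATE_ID[cache]].max() - Q[s, a])

# Roll out the learned greedy policy from an empty cache: it should fill the
# DCCD with the most popular items under the assumed request distribution.
cache = frozenset()
for _ in range(CACHE_SLOTS):
    cache, _ = step(cache, int(np.argmax(Q[STATE_ID[cache]])))
print("Greedy DCCD cache after training:", sorted(cache))

In the framework the paper describes, the state would additionally carry the user preferences and trajectories predicted by the federated model, the action space would cover placement across every ES in the domain plus the delivery path, and a deep network would replace the Q-table; this sketch only shows the shape of such a learning loop.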

Updated: 2021-08-20