Online Caching and Coding at the WiFi Edge: Gains and Tradeoffs
arXiv - CS - Networking and Internet Architecture Pub Date : 2020-01-21 , DOI: arxiv-2001.07334
Lalhruaizela Chhangte, Emanuele Viterbo, D Manjunath, and Nikhil Karamchandani

Video content delivery at the wireless edge continues to be challenged by insufficient bandwidth and highly dynamic user behavior, which affect both effective throughput and latency. Caching at the network edge and coded transmissions have been shown to improve the performance of video content delivery. The caches at the wireless edge stations (BSs, APs) and at the users' end devices can be populated by pre-caching content or by using online caching policies. In this paper, we propose a system in which content is cached at the users of a WiFi network via online caching policies, and coded delivery is employed by the WiFi AP to deliver the requested content to the user population. The content of each user's cache serves as side information for index coding. We also propose the LFU-Index cache replacement policy at the user, which demonstrably improves index coding opportunities at the WiFi AP in the proposed system. Through an extensive simulation study, we determine the gains achieved by caching and by index coding. Next, we analyze the tradeoffs between them in terms of data transmitted, latency, and throughput for different content request behaviors of the users. We also show that the proposed cache replacement policy outperforms traditional cache replacement policies such as LRU and LFU.
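The central mechanism in the abstract — cached content at the users serving as side information for coded (index-coded) delivery at the AP — can be illustrated with a minimal toy sketch. This is not the paper's implementation; the file contents and two-user setup are hypothetical, chosen only to show how one XOR broadcast can satisfy two requests when the users hold complementary files in their caches.

```python
# Toy index-coding sketch: an AP serves two users with one coded broadcast.
# User 1 has file B cached and requests A; user 2 has file A cached and
# requests B. The AP transmits A XOR B once; each user recovers its request
# by XOR-ing the broadcast with its cached side information.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical equal-length packets from the content library.
file_A = b"packetA!"
file_B = b"packetB!"

# Single coded transmission from the WiFi AP.
coded = xor_bytes(file_A, file_B)

# Each user decodes using its own cache as side information.
decoded_by_user1 = xor_bytes(coded, file_B)  # cached B -> recovers A
decoded_by_user2 = xor_bytes(coded, file_A)  # cached A -> recovers B

assert decoded_by_user1 == file_A
assert decoded_by_user2 == file_B
```

In this two-user case the AP sends one packet instead of two, halving the transmitted data; the paper's LFU-Index replacement policy is aimed at keeping user caches populated so that such coding opportunities arise often.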

Updated: 2020-01-22