Phoebe: Reuse-Aware Online Caching with Reinforcement Learning for Emerging Storage Models
arXiv - CS - Performance Pub Date : 2020-11-13 , DOI: arxiv-2011.07160 Nan Wu, Pengcheng Li
With data durability, high access speed, low power consumption, and byte
addressability, NVMe and SSD, widely acknowledged representatives of emerging
storage technologies, have been adopted broadly in many areas. However, one key
issue in the high-performance adoption of these technologies is how to properly
define intelligent cache layers so that the performance gap between emerging
storage and main memory can be well bridged. To this end, we propose Phoebe, a
reuse-aware reinforcement learning framework for optimal online caching that is
applicable to a wide range of emerging storage models. By continuously
interacting with the cache environment and the data stream, Phoebe is capable
of extracting critical temporal data dependencies and relative positional
information from a single trace, becoming ever smarter over time. To reduce
training overhead during online learning, we use periodic training to amortize
costs. Phoebe is evaluated on a set of Microsoft cloud storage workloads.
Experimental results show that Phoebe closes the gap in cache miss rate between
LRU and Belady's optimal policy by 70.3%, and between a state-of-the-art online
learning based cache policy and Belady's optimal policy by 52.6%.
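The reported 70.3% and 52.6% figures measure how much of the miss-rate gap between a baseline policy and Belady's offline-optimal (MIN) policy a new policy closes. As a minimal sketch of that metric (not the Phoebe implementation, which is a reinforcement learning framework), the following compares LRU against Belady's MIN on a toy trace; the function names and the O(n²) next-use scan in `belady_misses` are illustrative choices, not from the paper:

```python
from collections import OrderedDict

def lru_misses(trace, capacity):
    """Count misses for an LRU cache of the given capacity."""
    cache = OrderedDict()
    misses = 0
    for key in trace:
        if key in cache:
            cache.move_to_end(key)  # mark as most recently used
        else:
            misses += 1
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[key] = None
    return misses

def belady_misses(trace, capacity):
    """Count misses for Belady's MIN: evict the block reused farthest
    in the future (requires knowing the whole trace, hence offline)."""
    misses = 0
    cache = set()
    for i, key in enumerate(trace):
        if key in cache:
            continue
        misses += 1
        if len(cache) >= capacity:
            def next_use(k):
                for j in range(i + 1, len(trace)):
                    if trace[j] == k:
                        return j
                return float('inf')  # never reused: ideal victim
            cache.remove(max(cache, key=next_use))
        cache.add(key)
    return misses

def gap_closed(policy_misses, baseline_misses, optimal_misses):
    """Fraction of the baseline-to-optimal miss gap closed by a policy."""
    return (baseline_misses - policy_misses) / (baseline_misses - optimal_misses)
```

For example, on the trace `[1, 2, 3, 1, 2, 4, 1, 2, 3, 4]` with capacity 3, LRU incurs 6 misses and Belady's MIN incurs 5; a policy achieving 5.5 misses would close 50% of that gap under this metric.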
Updated: 2020-11-17