Resource Management for Power-Constrained HEVC Transcoding Using Reinforcement Learning
IEEE Transactions on Parallel and Distributed Systems (IF 5.3), Pub Date: 2020-12-01, DOI: 10.1109/tpds.2020.3004735
Luis Costero, Arman Iranfar, Marina Zapater, Francisco D. Igual, Katzalin Olcoz, David Atienza

The advent of online video streaming applications and services, along with users' demand for high-quality content, requires High Efficiency Video Coding (HEVC), which provides higher video quality and higher compression at the cost of increased complexity. On one hand, HEVC exposes a set of dynamically tunable parameters that provide trade-offs among Quality-of-Service (QoS), performance, and power consumption of the multi-core servers in the video provider's data center. On the other hand, resource management of modern multi-core servers is in charge of adapting system-level parameters, such as operating frequency and multithreading, to deal with concurrent applications and their requirements. Therefore, efficient multi-user HEVC streaming necessitates joint adaptation of application- and system-level parameters. Nonetheless, such a large and dynamic design space is challenging and difficult to address through conventional resource management strategies. Thus, in this work, we develop a multi-agent Reinforcement Learning framework that jointly adjusts application- and system-level parameters at runtime to satisfy the QoS of multi-user HEVC streaming on power-constrained servers. In particular, the design space, composed of all design parameters, is split into smaller independent sub-spaces. Each design sub-space is assigned to a particular agent so that it can be explored faster, yet accurately. The benefits of our approach are revealed in terms of adaptability and quality (with up to $4\times$ improvement in QoS compared to a static resource management scheme) and learning time ($6\times$ faster than an equivalent mono-agent implementation). Finally, we show that the power-capping techniques formulated in this work outperform hardware-based power capping with respect to quality.
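The core idea described in the abstract, splitting the joint design space into independent sub-spaces, one per reinforcement-learning agent, all driven by a shared reward, can be illustrated with a small toy sketch. The parameter names (qp, freq_ghz), the two-state representation, and the reward and power models below are illustrative assumptions for this sketch only; they are not the paper's actual state, action, or reward formulation.

```python
# A toy sketch (not the paper's implementation): two tabular Q-learning agents,
# one per design sub-space, learn from a shared reward under a power cap.
import random
from collections import defaultdict

random.seed(0)

# Hypothetical sub-spaces for illustration only.
APP_ACTIONS = [("qp", 22), ("qp", 27), ("qp", 32)]                       # application-level knob (HEVC QP)
SYS_ACTIONS = [("freq_ghz", 1.2), ("freq_ghz", 2.0), ("freq_ghz", 2.8)]  # system-level knob (core frequency)


class QAgent:
    """Epsilon-greedy tabular Q-learning over a single design sub-space."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, eps=0.2):
        self.actions, self.alpha, self.gamma, self.eps = actions, alpha, gamma, eps
        self.q = defaultdict(float)  # (state, action_index) -> estimated value

    def act(self, state):
        if random.random() < self.eps:
            return random.randrange(len(self.actions))
        return self.greedy(state)

    def greedy(self, state):
        return max(range(len(self.actions)), key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in range(len(self.actions)))
        td_error = reward + self.gamma * best_next - self.q[(state, action)]
        self.q[(state, action)] += self.alpha * td_error


def toy_power(freq):
    return 20.0 + 28.0 * freq  # crude power model for the frequency knob


def toy_reward(qp, freq, fps_target=25.0, power_cap=95.0):
    """Toy stand-in for the environment: hit an fps target under a power cap
    while penalizing quality loss (higher QP). Not the paper's reward function."""
    fps = freq * qp / 2.0                  # higher QP / frequency -> higher throughput
    qos = min(fps, fps_target) - 0.3 * qp  # reward throughput, penalize quality loss
    over = max(0.0, toy_power(freq) - power_cap)
    return qos - 5.0 * over                # strong penalty for breaking the cap


app_agent, sys_agent = QAgent(APP_ACTIONS), QAgent(SYS_ACTIONS)
state = "under_cap"
for _ in range(5000):
    a_app, a_sys = app_agent.act(state), sys_agent.act(state)
    (_, qp), (_, freq) = APP_ACTIONS[a_app], SYS_ACTIONS[a_sys]
    reward = toy_reward(qp, freq)
    next_state = "under_cap" if toy_power(freq) <= 95.0 else "over_cap"
    app_agent.update(state, a_app, reward, next_state)  # both agents see the same shared reward
    sys_agent.update(state, a_sys, reward, next_state)
    state = next_state

print("learned app knob:", APP_ACTIONS[app_agent.greedy(state)])
print("learned sys knob:", SYS_ACTIONS[sys_agent.greedy(state)])
```

In this sketch, each agent's Q-table covers only its own sub-space, so table size grows with the number of values per knob rather than with the Cartesian product of all knobs; that is the intuition behind the faster exploration and reduced learning time reported for the multi-agent decomposition.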

Updated: 2020-12-01