Energy management of intelligent building based on deep reinforced learning
Alexandria Engineering Journal (IF 6.2), Pub Date: 2020-11-25, DOI: 10.1016/j.aej.2020.11.005
Xiaoqing Huang, Dongliang Zhang, XiaoSong Zhang

In the context of the ubiquitous power Internet of Things (UPIoT), this paper aims to make full use of distributed new energy and to rationalize household energy management strategies. Inspired by the energy management system (EMS) of intelligent buildings, the authors searched for the optimal energy control plan based on a deep reinforcement learning (DRL) algorithm. Under the overall system architecture, a distributed new energy generation system was modelled for consumers in intelligent buildings, covering energy storage, household electrical loads, new energy vehicles, etc. Next, a Q-learning-based energy management model was established for intelligent buildings, and the corresponding constraints were set up. After that, the reward and penalty functions of the EMSs for households and for the intelligent building were designed based on the daily economic dispatch (DED) model. Finally, the energy management strategy was optimized, yielding a real-time optimal control process. Simulations proved the proposed energy management strategy effective for intelligent buildings. The research results provide a reference for energy management in other microgrids.
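To make the Q-learning formulation concrete, the sketch below trains a tabular Q-learning agent that schedules a single household battery over a 24-hour daily economic dispatch horizon, with the reward defined as the negative of electricity cost plus a penalty for violating state-of-charge constraints. This is only a minimal illustration of the general technique named in the abstract; the price profile, battery size, load, discretization, and learning parameters are all assumptions and are not taken from the paper.

import numpy as np

# Minimal tabular Q-learning sketch for household battery scheduling.
# All quantities below (prices, load, battery capacity, discretization)
# are illustrative assumptions, not the paper's actual parameters.

N_SOC = 11          # discretized battery state-of-charge levels (0%..100%)
N_HOURS = 24        # daily economic dispatch horizon, one step per hour
ACTIONS = np.array([-1.0, 0.0, 1.0])   # discharge / idle / charge (kWh per step)

# Hypothetical time-of-use price (currency/kWh) and fixed household load (kWh)
hours = np.arange(N_HOURS)
price = np.where((hours >= 8) & (hours < 22), 1.0, 0.4)
load = np.full(N_HOURS, 1.5)

alpha, gamma, epsilon = 0.1, 0.95, 0.1
Q = np.zeros((N_HOURS, N_SOC, len(ACTIONS)))

def step(hour, soc_idx, a_idx):
    """One transition: apply charge/discharge, pay for grid import."""
    soc = soc_idx / (N_SOC - 1)
    action = ACTIONS[a_idx]
    new_soc = np.clip(soc + action / 10.0, 0.0, 1.0)   # 10 kWh battery assumed
    grid_import = max(load[hour] + action, 0.0)         # charging adds to demand
    cost = price[hour] * grid_import
    # Penalty term for violating the state-of-charge constraint
    penalty = 5.0 if (soc + action / 10.0 < 0.0 or soc + action / 10.0 > 1.0) else 0.0
    reward = -(cost + penalty)                           # reward = negative daily cost
    return int(round(new_soc * (N_SOC - 1))), reward

for episode in range(5000):
    soc_idx = N_SOC // 2
    for hour in range(N_HOURS):
        if np.random.rand() < epsilon:                   # epsilon-greedy exploration
            a_idx = np.random.randint(len(ACTIONS))
        else:
            a_idx = int(np.argmax(Q[hour, soc_idx]))
        next_soc, reward = step(hour, soc_idx, a_idx)
        next_q = 0.0 if hour == N_HOURS - 1 else np.max(Q[hour + 1, next_soc])
        Q[hour, soc_idx, a_idx] += alpha * (reward + gamma * next_q - Q[hour, soc_idx, a_idx])
        soc_idx = next_soc

# Greedy policy after training: preferred action per hour at mid state of charge
print([float(ACTIONS[int(np.argmax(Q[h, N_SOC // 2]))]) for h in range(N_HOURS)])

In this toy setup the learned policy tends to charge during the cheap off-peak hours and discharge during the expensive daytime window, which is the qualitative behaviour a DED-based reward is meant to encourage; the paper's full model additionally covers distributed generation, household loads, and new energy vehicles.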



