Deep reinforcement learning based home energy management system with devices operational dependencies

  • Original Article
  • International Journal of Machine Learning and Cybernetics

Abstract

Advanced metering infrastructure and two-way communication technologies facilitate the development of home energy management systems in smart homes. In this paper, we propose a reinforcement learning (RL) based energy management strategy for controllable loads. First, based on mathematical models of the different types of home energy resources (HERs), their scheduling is formulated as a Markov decision process. Then, two RL algorithms, deep Q-learning and the deep deterministic policy gradient, are applied. Based on the living habits of the residents, operational dependency modes for HERs are proposed and integrated into the RL algorithms. Case studies verify that the proposed method schedules HERs properly while satisfying the established dependency modes, and the gap between the achieved result and the optimal solution is relatively small.
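The scheduling idea in the abstract can be illustrated with a minimal tabular Q-learning sketch for a single interruptible, constant-power appliance under a real-time price. This is only an illustration of the underlying MDP and Q-update; the paper itself uses deep Q-learning and the deep deterministic policy gradient, and all prices, powers, and hyperparameters below are hypothetical:

```python
import random

random.seed(0)

# Hypothetical data: 6 time slots of real-time price lambda_t^RTP ($/kWh)
rtp = [0.10, 0.30, 0.25, 0.08, 0.12, 0.40]
P_RATE = 2.0                 # rated power P_i^rate (kW)
WT = 2                       # required working time WT_i (slots)
ALPHA, GAMMA, EPS = 0.1, 1.0, 0.2
T = len(rtp)

Q = {}                       # Q[(state, action)], state = (t, remaining work)

def q(s, a):
    return Q.get((s, a), 0.0)

def act(s, greedy=False):
    t, remaining = s
    if remaining == 0:       # task finished: appliance must stay off
        return 0
    if not greedy and random.random() < EPS:
        return random.choice([0, 1])
    return max((0, 1), key=lambda a: q(s, a))

for episode in range(20000):
    t, remaining = 0, WT
    while t < T:
        s = (t, remaining)
        a = act(s)
        cost = rtp[t] * P_RATE * a            # electricity cost of this slot
        # Large penalty if the day ends with the task unfinished
        unfinished = remaining - a if t == T - 1 else 0
        r = -cost - 10.0 * unfinished
        s2 = (t + 1, remaining - a)
        best_next = 0.0 if t + 1 == T else max(q(s2, 0), q(s2, 1))
        Q[(s, a)] = q(s, a) + ALPHA * (r + GAMMA * best_next - q(s, a))
        t, remaining = t + 1, remaining - a

# Greedy rollout after training: the agent should favor the cheapest slots
schedule, remaining = [], WT
for t in range(T):
    a = act((t, remaining), greedy=True)
    schedule.append(a)
    remaining -= a
print(schedule)
```

The state carries both the time slot and the remaining working time, which is what lets the learned policy defer an interruptible load to cheap slots while the terminal penalty enforces task completion.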




Abbreviations

HEMS :

Home energy management system

HER :

Home energy resource

EV :

Electric vehicle

TCL :

Thermostatically controlled load

RTP :

Real-time price

DCT :

Demand charge tariff

\(\Theta^{NC}\) :

Set of non-interruptible appliances with constant power

\(\Theta^{IC}\) :

Set of interruptible appliances with constant power

\(\Theta^{ESS}\) :

Set of energy storage systems

\(\Theta^{TCL}\) :

Set of thermostatically controlled loads

i :

Appliance index

t :

Time index

\(\overline{{E_{i}^{ESS} }}\) \(\underline{{E_{i}^{ESS} }}\) :

Upper and lower energy storage limits

\(E_{i}^{ESS,EXP}\) :

Expected energy storage state at the end of the day

\(\overline{{E_{i}^{EV} }}\) :

Maximum capacity of the EVs

\(E_{i,t\_arrive}^{EV}\) :

Energy storage state of EVs when arriving

\(P_{i}^{rate}\) :

Rated power of the constant power HERs

\(\overline{{P_{i}^{ESS,C} }}\) \(\overline{{P_{i}^{ESS,D} }}\) :

Charging and discharging power limits

\(P_{t}^{NS}\) :

Energy consumption of the non-schedulable load

\(P_{t}^{PV}\) :

Output of the PVs

R, C :

Thermal resistance and heat ratio of air

\(t_{i}^{s} ,t_{i}^{e}\) :

Starting and ending times of the appliances

\(t\_arrive\) :

Arrival time of each EV

\(T^{set}\) :

Set indoor temperature

\(T^{min}\),\(T^{max}\) :

Minimum and maximum indoor temperature

\(T_{t}^{out}\) :

Outdoor temperature

\(WT_{i}\) :

Working time of the HERs

\(W_{i}\),\(D_{i}\) :

Electricity consumption per unit distance and daily driving distance

\(\eta^{C}\), \(\eta^{D}\) :

Charging and discharging efficiency

\(\lambda_{t}^{RTP}\) :

Real-time electricity price

\(\lambda^{DCT}\) :

Demand charge tariff

\(\varphi\) :

Weight coefficient

\(P^{\prime}\) :

Historical peak power recorded in the current billing cycle

\(\delta\) :

Temperature band

\(f^{cl}\) :

Clothing surface area factor

\(h^{c}\) :

Heat transfer coefficient

\(I^{cl}\) :

Thermal resistance of clothing

\(M\) :

Metabolic rate

\(PMV_{t}\) :

Predicted mean vote

\(PPD\) :

Predicted percentage of dissatisfied

\(P^{v}\) :

Vapor pressure in ambient air

\(rh\) :

Relative air humidity

\(T^{a}\) :

Indoor ambient air temperature

\(T^{mrt}\) :

Mean radiant temperature

\(T^{cl}\) :

Mean temperature of the outer surface of the clothed body

\(v^{ar}\) :

Relative air velocity

\(E_{i,t}^{ESS}\) :

Energy storage state

\(P_{i,t}^{ESS,C}\) :

Charging power of energy storage system

\(P_{i,t}^{ESS,D}\) :

Discharging power of energy storage system

\(P_{i,m,t}^{EV,CHA}\) :

Charging power of EVs

\(P_{i,m,t}^{EV,DIS}\) :

Discharging power of EVs

\(P_{t}^{TCL}\) :

Working power of TCLs

\(\widehat{{P_{t} }}\) :

Total energy consumption of the smart home

\(P_{i,t}^{HER}\) :

Energy consumption of HERs

\(T_{t}^{in}\) :

Indoor temperature

\(t*\) :

Time when the HERs are turned on

\(\delta_{i,t}\) :

Operation state of the HERs
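Several of the symbols above describe the system dynamics used in the MDP formulation. As a minimal illustrative sketch (all parameter values are hypothetical, and the first-order thermal model below is one common choice rather than necessarily the paper's exact equations), the ESS energy transition with efficiencies \(\eta^C, \eta^D\), the TCL indoor-temperature transition with thermal parameters R and C, and Fanger's standard PMV-to-PPD relation can be written as:

```python
import math

# Hypothetical parameters for one ESS and one TCL (illustrative only)
DT = 1.0                      # slot length (h)
ETA_C, ETA_D = 0.95, 0.95     # charging / discharging efficiency
E_MAX, E_MIN = 10.0, 1.0      # upper / lower energy storage limits (kWh)
R, C = 2.0, 2.0               # thermal resistance (C/kW), heat capacity (kWh/C)

def ess_step(E, p_charge, p_discharge):
    """Energy-storage transition for E_{i,t}^{ESS}, clipped to its limits."""
    E_next = E + ETA_C * p_charge * DT - p_discharge * DT / ETA_D
    return min(max(E_next, E_MIN), E_MAX)

def tcl_step(T_in, T_out, p_cool):
    """First-order thermal model for the indoor temperature T_t^{in}.
    The indoor temperature decays toward T_t^{out}; cooling power
    p_cool (kW) pulls it back down."""
    decay = math.exp(-DT / (R * C))
    return T_out - (T_out - T_in) * decay - p_cool * R * (1 - decay)

def ppd(pmv):
    """Fanger's predicted percentage of dissatisfied as a function of
    the predicted mean vote (ISO 7730); PPD bottoms out at 5% for PMV = 0."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

print(ess_step(5.0, 2.0, 0.0))     # charge at 2 kW for one slot
print(tcl_step(26.0, 32.0, 1.5))   # one slot of cooling on a hot day
print(ppd(0.5))
```

In the RL formulation these transitions generate the next state from the agent's charging, discharging, and cooling actions, while the PPD term is the usual way a thermal-comfort cost enters the reward.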


Author information

Corresponding author

Correspondence to Yuechuan Tao.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Si, C., Tao, Y., Qiu, J. et al. Deep reinforcement learning based home energy management system with devices operational dependencies. Int. J. Mach. Learn. & Cyber. 12, 1687–1703 (2021). https://doi.org/10.1007/s13042-020-01266-5

