Multiple Kernel Learning-Based Transfer Regression for Electric Load Forecasting
IEEE Transactions on Smart Grid ( IF 9.6 ) Pub Date : 2019-08-06 , DOI: 10.1109/tsg.2019.2933413
Di Wu , Boyu Wang , Doina Precup , Benoit Boulet

Electric load forecasting, especially short-term load forecasting (STLF), is becoming increasingly important for power system operation. We propose to use multiple kernel learning (MKL) for residential electric load forecasting, since it provides more flexibility than traditional single-kernel methods. Computation time is an important issue for short-term forecasting, especially for energy scheduling; however, conventional MKL methods usually lead to complicated optimization problems. Another practical issue for this application is that only a very limited amount of data may be available to train a reliable forecasting model for a new house, while historical data collected from other houses can be leveraged to improve the prediction performance for the new house. In this paper, we propose a boosting-based framework for MKL regression that addresses both issues for STLF. In particular, we first adopt boosting to learn an ensemble of multiple kernel regressors, and then extend this framework to the transfer learning setting. Furthermore, we consider two different settings: homogeneous transfer learning and heterogeneous transfer learning. Experimental results on residential data sets demonstrate that the forecasting error can be reduced by a large margin using knowledge learned from other houses.
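The general idea of boosting an ensemble of kernel regressors can be illustrated with a minimal sketch. This is not the authors' algorithm: the kernel dictionary (RBF kernels at a few bandwidths), the gradient-boosting-style residual fitting, and all hyperparameter values below are assumptions chosen for illustration. Each round fits a kernel ridge regressor on the current residuals with every candidate kernel and keeps the best one.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    """RBF (Gaussian) kernel matrix between two sample sets."""
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

class BoostedMKLRegressor:
    """Illustrative boosting-based multiple kernel regression.

    Each boosting round solves a kernel ridge problem on the current
    residuals for every kernel in the dictionary, keeps the kernel with
    the lowest residual error, and adds it to the ensemble with a
    shrinkage factor (learning rate).
    """

    def __init__(self, gammas=(0.1, 1.0, 10.0), n_rounds=20,
                 lam=1e-2, lr=0.5):
        self.gammas = gammas      # kernel dictionary (assumed: RBF bandwidths)
        self.n_rounds = n_rounds  # number of boosting rounds
        self.lam = lam            # ridge regularization
        self.lr = lr              # shrinkage / learning rate

    def fit(self, X, y):
        self.X_train = X
        self.learners = []        # list of (gamma, dual coefficients)
        residual = y.astype(float).copy()
        n = len(X)
        for _ in range(self.n_rounds):
            best = None
            for g in self.gammas:
                K = rbf_kernel(X, X, g)
                # Kernel ridge: solve (K + lam*I) alpha = residual
                alpha = np.linalg.solve(K + self.lam * np.eye(n), residual)
                err = np.mean((K @ alpha - residual) ** 2)
                if best is None or err < best[0]:
                    best = (err, g, alpha)
            _, g, alpha = best
            residual -= self.lr * (rbf_kernel(X, X, g) @ alpha)
            self.learners.append((g, alpha))
        return self

    def predict(self, X):
        pred = np.zeros(len(X))
        for g, alpha in self.learners:
            pred += self.lr * (rbf_kernel(X, self.X_train, g) @ alpha)
        return pred
```

A transfer-learning variant in this spirit could, for example, warm-start `self.learners` with regressors boosted on source-house data before continuing the boosting rounds on the (scarce) target-house residuals; the paper's actual homogeneous and heterogeneous transfer schemes are more involved.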

Updated: 2020-04-22