TY - JOUR
T1 - Smart buildings energy consumption forecasting using adaptive evolutionary bagging extra tree learning models
AU - Neshat, Mehdi
AU - Thilakaratne, Menasha
AU - El-Abd, Mohammed
AU - Mirjalili, Seyedali
AU - Gandomi, Amir H.
AU - Boland, John
N1 - Publisher Copyright:
© 2025 The Authors
PY - 2025/7/11
Y1 - 2025/7/11
AB - Smart buildings are gaining popularity because they can enhance energy efficiency, lower costs, improve security, and provide a more comfortable and convenient environment for building occupants. The building sector consumes a considerable share of the global energy supply and plays a pivotal role in future decarbonisation pathways. Reliable and accurate energy demand forecasting is therefore crucial for managing energy consumption and improving energy efficiency in smart buildings. However, developing an effective predictive model for the total energy use of appliances at the building level is challenging due to temporal oscillations and complex linear and non-linear patterns. This paper proposes three hybrid ensemble predictive models, incorporating Bagging, Stacking, and Voting mechanisms combined with a fast and effective evolutionary hyper-parameter tuner. The performance of the proposed energy forecasting model was evaluated using a hybrid dataset of meteorological parameters, energy use of appliances, temperature, humidity, and lighting energy consumption from different sections of a building located in Stambruges, Mons, Belgium, collected by 18 sensors. To provide a comparative framework and investigate the efficiency of the proposed predictive model, 15 popular machine learning (ML) models were compared, including two classic ML models, three Neural Networks (NN), a Decision Tree (DT), a Random Forest (RF), two Deep Learning (DL) models, and six Ensemble models. The prediction results indicate that the adaptive evolutionary bagging model surpassed the other predictive models in both accuracy and learning error. Notably, it delivered accuracy gains of 12.6%, 13.7%, 12.9%, 27.04%, and 17.4% compared to Extreme Gradient Boosting (XGB), Categorical Boosting (CatBoost), Gradient Boosting Machine (GBM), Light Gradient Boosting Machine (LGBM), and RF, respectively.
KW - Deep learning
KW - Energy forecasting
KW - Ensemble learning
KW - Extra tree
KW - Hyper-parameter tuning
KW - Optimisation
KW - Smart building
UR - https://www.scopus.com/pages/publications/105010340677
U2 - 10.1016/j.energy.2025.137130
DO - 10.1016/j.energy.2025.137130
M3 - Article
AN - SCOPUS:105010340677
SN - 0360-5442
VL - 333
JO - Energy
JF - Energy
M1 - 137130
ER -