Smart buildings energy consumption forecasting using adaptive evolutionary bagging extra tree learning models

Mehdi Neshat, Menasha Thilakaratne, Mohammed El-Abd, Seyedali Mirjalili, Amir H. Gandomi, John Boland

Research output: Contribution to journal › Article › peer-review

Abstract

Smart buildings are gaining popularity because they can enhance energy efficiency, lower costs, improve security, and provide a more comfortable and convenient environment for building occupants. The building sector consumes a considerable share of the global energy supply and plays a pivotal role in future decarbonisation pathways. To manage energy consumption and improve energy efficiency in smart buildings, developing reliable and accurate energy demand forecasts is crucial. However, building an effective predictive model for the total energy use of appliances at the building level is challenging due to temporal oscillations and complex linear and non-linear patterns. This paper proposes three hybrid ensemble predictive models, incorporating Bagging, Stacking, and Voting mechanisms combined with a fast and effective evolutionary hyper-parameter tuner. The performance of the proposed energy forecasting model was evaluated on a hybrid dataset of meteorological parameters, appliance energy use, temperature, humidity, and lighting energy consumption, collected by 18 sensors in different sections of a building located in Stambruges, Mons, Belgium. To provide a comparative framework and investigate the efficiency of the proposed predictive model, 15 popular machine learning (ML) models were compared, including two classic ML models, three Neural Networks (NN), a Decision Tree (DT), a Random Forest (RF), two Deep Learning (DL) models, and six Ensemble models. The prediction results indicate that the adaptive evolutionary bagging model surpassed the other predictive models in both accuracy and learning error. Notably, it delivered accuracy gains of 12.6%, 13.7%, 12.9%, 27.04%, and 17.4% compared with Extreme Gradient Boosting (XGB), Categorical Boosting (CatBoost), Gradient Boosting Machine (GBM), Light Gradient Boosting Machine (LGBM), and RF, respectively.
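To illustrate the overall approach described above, the following is a minimal sketch (assuming Python with scikit-learn) of a bagged extra-tree regressor whose hyper-parameters are tuned by a simple (1+1)-style evolutionary loop. The dataset, parameter ranges, and mutation scheme are illustrative assumptions only and do not reproduce the paper's adaptive evolutionary tuner.

```python
# Minimal sketch: bagged extra-tree regressor with a simple (1+1) evolutionary
# hyper-parameter search. Illustrative only -- the data, parameter ranges, and
# mutation scheme are assumptions, not the paper's adaptive tuner.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import ExtraTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Placeholder data standing in for the 18-sensor building dataset.
X = rng.normal(size=(500, 18))
y = rng.normal(size=500)

def sample_params():
    # Draw a random candidate configuration (ranges assumed for illustration).
    return {
        "n_estimators": int(rng.integers(50, 301)),
        "max_samples": float(rng.uniform(0.5, 1.0)),
        "max_features": float(rng.uniform(0.5, 1.0)),
    }

def fitness(params):
    # Negative RMSE from 3-fold cross-validation (higher is better).
    model = BaggingRegressor(estimator=ExtraTreeRegressor(random_state=0),
                             random_state=0, **params)
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_root_mean_squared_error").mean()

# (1+1) evolutionary loop: mutate the incumbent and keep the better of the two.
best = sample_params()
best_fit = fitness(best)
for _ in range(10):
    child = dict(best)
    child["n_estimators"] = int(np.clip(child["n_estimators"] + rng.integers(-30, 31), 50, 300))
    child["max_samples"] = float(np.clip(child["max_samples"] + rng.normal(0, 0.1), 0.5, 1.0))
    child["max_features"] = float(np.clip(child["max_features"] + rng.normal(0, 0.1), 0.5, 1.0))
    child_fit = fitness(child)
    if child_fit > best_fit:
        best, best_fit = child, child_fit

print("Best configuration:", best, "CV score:", best_fit)
```

In this sketch, bagging over single extremely randomised trees stands in for the bagging extra-tree ensemble, and the (1+1) loop is a deliberately simplified proxy for the evolutionary hyper-parameter tuner; the Stacking and Voting variants mentioned in the abstract are not shown.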

Original language: English
Article number: 137130
Journal: Energy
Volume: 333
State: Published - 11 Jul 2025

Keywords

  • Deep learning
  • Energy forecasting
  • Ensemble learning
  • Extra tree
  • Hyper-parameter tuning
  • Optimisation
  • Smart building
