Adaptive Energy-Aware Algorithms for Minimizing Energy Consumption and SLA Violation in Cloud Computing

Published: 01 Jan 2018, Last Modified: 28 Jan 2025 · IEEE Access 2018 · CC BY-SA 4.0
Abstract: In cloud computing, high energy consumption and service-level agreement (SLA) violations are challenging issues, since the demand for computational power is growing rapidly and requires large-scale cloud data centers. Many existing energy-aware approaches focus on minimizing energy consumption while ignoring SLA violations at the time a virtual machine (VM) is selected from an overloaded host. They also do not account for current network traffic, which causes performance degradation, and therefore may not actually reduce SLA violations under a variety of workloads. In this context, this paper proposes three adaptive models, namely gradient descent-based regression (Gdr), maximize correlation percentage (MCP), and a bandwidth-aware selection policy (Bw), that can significantly reduce energy consumption and SLA violations. Energy-aware methods for detecting overloaded hosts and selecting VMs from them are necessary to improve the energy efficiency and SLA compliance of a cloud data center; it is also beneficial to migrate all VMs away from underloaded hosts so that they become idle and can be switched to an energy-saving mode. Gdr and MCP are adaptive energy-aware algorithms, based on a robust regression model, for overloaded host detection. Bw is a dynamic VM selection policy that selects VMs from an overloaded host according to network traffic while respecting SLAs. Experimental results on real workload traces, obtained with the CloudSim simulator, show that the proposed algorithms reduce energy consumption while maintaining the required performance levels in a cloud data center.
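The abstract only names the algorithms, so the sketch below illustrates the general idea behind regression-based overload detection: fit a trend to a host's recent CPU-utilization history, extrapolate one step ahead, and flag the host if the prediction crosses a threshold. It is a minimal sketch, not the paper's Gdr method: it uses plain least-squares linear regression fitted by gradient descent rather than the robust regression the paper describes, and the function names, learning rate, and 0.8 utilization threshold are illustrative assumptions.

```python
def gradient_descent_fit(history, lr=0.1, epochs=2000):
    """Fit y = a*t + b to a CPU-utilization history via batch gradient descent."""
    n = len(history)
    # Normalize time indices to [0, 1] so the step size stays stable.
    ts = [i / max(n - 1, 1) for i in range(n)]
    a, b = 0.0, 0.0
    for _ in range(epochs):
        grad_a = grad_b = 0.0
        for t, y in zip(ts, history):
            err = (a * t + b) - y
            grad_a += 2 * err * t / n
            grad_b += 2 * err / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

def is_host_overloaded(history, threshold=0.8):
    """Extrapolate the fitted trend one step ahead and flag the host if it exceeds the threshold."""
    if len(history) < 2:
        return bool(history) and history[-1] > threshold
    a, b = gradient_descent_fit(history)
    next_t = len(history) / (len(history) - 1)  # one step beyond the normalized window
    predicted = a * next_t + b
    return predicted > threshold

# Example: a host whose utilization is trending upward.
recent_utilization = [0.55, 0.60, 0.68, 0.74, 0.79]
print(is_host_overloaded(recent_utilization))  # True: the extrapolated trend crosses 0.8
```

In the same spirit, a bandwidth-aware selection policy such as Bw would rank the VMs on an overloaded host by their current network traffic and choose migration candidates accordingly; the exact ranking criterion is defined in the paper, not reproduced here.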