Affinity and Interference-Aware Service Deployment for Energy Efficiency in Cloud Data Centers: A Deep Reinforcement Learning Approach

Published: 2025 · Last Modified: 23 Jan 2026 · COMPSAC 2025 · CC BY-SA 4.0
Abstract: Cloud computing has revolutionized data center management by providing scalable, efficient resources for computation and data management. However, deploying container-based services in data centers presents significant challenges: (1) underutilized active servers consume substantial energy, so deployments must be optimized for energy efficiency; (2) affinity requirements between services and servers must be respected to ensure appropriate placements; (3) Quality of Service (QoS) requirements must be met, particularly by avoiding performance interference when multiple services are deployed on the same server. To address these challenges, we propose Affinity-Interference Energy Deployment (AIED), a novel algorithm based on Deep Reinforcement Learning (DRL). AIED strategically consolidates services onto fewer servers to improve energy efficiency while adhering to stringent QoS and affinity constraints. By employing a demand-supply model to quantify QoS requirements and formulating the deployment problem as a Markov Decision Process (MDP), the algorithm dynamically adapts to fluctuating demands and resource availability. Extensive simulations demonstrate that AIED significantly outperforms existing baseline strategies, reducing energy consumption while ensuring robust compliance with both QoS and affinity constraints.
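The abstract describes an MDP whose reward must trade off energy against affinity and interference (QoS) constraints. The paper's exact state encoding, penalty weights, and power model are not given here, so the following is only a minimal illustrative sketch: the state is per-server residual capacity plus the incoming service's demand, the action is the index of the target server, and the reward combines a load-proportional energy cost (plus an idle cost for waking a server) with hypothetical penalty terms for affinity and interference violations. All class names, thresholds, and weights are assumptions, not the authors' formulation.

```python
class DeploymentEnv:
    """Illustrative MDP for affinity/interference-aware service placement."""

    def __init__(self, capacities, affinity, interference_limit=0.8,
                 idle_power=100.0, dynamic_power=200.0):
        self.capacities = list(capacities)   # total capacity per server
        self.load = [0.0] * len(capacities)  # current load per server
        self.affinity = affinity             # affinity[svc_type] = allowed server indices
        self.interference_limit = interference_limit
        self.idle_power = idle_power
        self.dynamic_power = dynamic_power

    def state(self, demand, svc_type):
        # Observation: residual capacity of each server + the service's demand/type.
        residual = [c - l for c, l in zip(self.capacities, self.load)]
        return residual + [demand, svc_type]

    def step(self, action, demand, svc_type):
        """Place a service on server `action`; return the (illustrative) reward."""
        util_after = (self.load[action] + demand) / self.capacities[action]
        reward = 0.0
        if action not in self.affinity[svc_type]:
            reward -= 50.0                   # affinity-violation penalty (assumed weight)
        if util_after > self.interference_limit:
            reward -= 30.0                   # interference / QoS penalty (assumed weight)
        newly_active = self.load[action] == 0.0
        self.load[action] += demand
        # Energy: load-proportional dynamic part, plus idle power if the
        # server was previously off (this is what consolidation avoids).
        energy = self.dynamic_power * demand / self.capacities[action]
        if newly_active:
            energy += self.idle_power
        reward -= 0.1 * energy
        return reward


env = DeploymentEnv(capacities=[1.0, 1.0], affinity={0: {0, 1}, 1: {1}})
r1 = env.step(0, demand=0.3, svc_type=0)  # first placement wakes a cold server
r2 = env.step(0, demand=0.3, svc_type=0)  # consolidating on the warm server is cheaper
```

Under this toy model, consolidating onto an already-active server yields a higher reward than waking a new one, which is the incentive a DRL policy would exploit; violating the affinity set or pushing utilization past the interference threshold flips that trade-off.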