Abstract: Mobile edge computing (MEC) is receiving growing attention. In MEC environments, users' application requests (i.e., sets of consecutive microservice requests) are first sent to nearby edge servers, which significantly reduces latency compared to sending them to the cloud center. It is therefore vital to deploy suitable microservices on edge servers, taking into account the servers' resource and coverage limitations as well as user mobility. However, existing deployment approaches focus on offline scenarios, where a service vacuum may occur between two consecutive offline deployments because deployment takes a long time. Online microservice deployment is thus urgently needed to better satisfy user requirements. This paper proposes DDQN, a deep reinforcement learning approach to online microservice deployment. Specifically, DDQN leverages the Dueling DQN (Deep Q-Network) model to generate real-time microservice deployment plans. Experiments show that the proposed approach effectively improves the success rate of microservice deployment in online scenarios without sacrificing timeliness.
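To make the Dueling DQN idea concrete, below is a minimal PyTorch sketch of a dueling Q-network head. It is not the paper's implementation: the state dimension, the action space (here assumed to index microservice-to-edge-server placement pairs), and the layer sizes are illustrative assumptions; only the dueling decomposition Q(s, a) = V(s) + A(s, a) - mean_a A(s, a) follows the standard Dueling DQN formulation.

```python
import torch
import torch.nn as nn

class DuelingQNetwork(nn.Module):
    """Dueling DQN head: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)."""

    def __init__(self, state_dim: int, num_actions: int, hidden: int = 128):
        super().__init__()
        # Shared encoder over a (hypothetical) flattened edge/user state vector.
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Separate streams for the state value V(s) and the advantage A(s, a).
        self.value = nn.Linear(hidden, 1)
        self.advantage = nn.Linear(hidden, num_actions)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.encoder(state)
        v = self.value(h)      # shape: (batch, 1)
        a = self.advantage(h)  # shape: (batch, num_actions)
        # Subtracting the mean advantage keeps V and A identifiable.
        return v + a - a.mean(dim=1, keepdim=True)

# Hypothetical usage: each action indexes a (microservice, edge server)
# placement pair, and a deployment decision is read off greedily.
net = DuelingQNetwork(state_dim=64, num_actions=32)
q_values = net(torch.randn(1, 64))
placement = q_values.argmax(dim=1)
```

The dueling decomposition lets the network learn how good a system state is (e.g., current edge-server load) separately from which placement action is best in that state, which is one reason this architecture suits decision problems where many actions have similar value.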