Scalable Heterogeneous Scheduling Based Model Parallelism for Real-Time Inference of Large-Scale Deep Neural Networks

Published: 01 Jan 2024. Last Modified: 01 Oct 2024. IEEE Trans. Emerg. Top. Comput. Intell., 2024. License: CC BY-SA 4.0
Abstract: Scaling up the capacity of deep neural networks (DNNs) is an effective way to improve model quality across many DNN-based applications, and as a result models continue to grow. To execute these large and complex models efficiently, devices are becoming increasingly heterogeneous, combining CPUs with domain-specific hardware accelerators. In many cases, the capacity of a large-scale model exceeds the memory limit of a single accelerator. Recent work has shown that model parallelism, which partitions a DNN's computational graph across multiple devices, can not only address this problem but also provide significant performance improvements. In this work, we focus on optimizing model parallelism for timely inference of large-scale DNNs on heterogeneous processors. We transform the computation graphs of DNNs into directed acyclic graphs (DAGs) and propose to use heterogeneous scheduling methods to determine the model partition plan. However, we find that existing DAG scheduling methods scale poorly to large DAGs and incur high computational complexity. To this end, we propose a scalable, DAG-partition-assisted scheduling method for heterogeneous processors that addresses these problems. Our approach jointly considers model execution time, scalability, and per-device memory constraints. We demonstrate the effectiveness of our approach on both small- and large-scale DNN models. To the best of our knowledge, this is the first work to explore DAG scheduling and partitioning methods for model parallelism, opening new avenues for accelerating large-scale DNN inference.
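To make the scheduling idea in the abstract concrete, below is a minimal sketch of the kind of heterogeneous list scheduling (HEFT-style) that such work typically builds on: operators of the DNN's DAG are prioritized by upward rank and greedily mapped to the device with the earliest finish time, subject to a per-device memory limit. This is not the paper's algorithm; all names (Node, Device, heft_schedule) and the cost/communication model are illustrative assumptions.

```python
# HEFT-style list scheduling sketch for placing a DNN computation DAG on
# heterogeneous devices under per-device memory limits (illustrative only;
# not the authors' implementation).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cost: dict                                  # device name -> estimated exec time
    mem: float                                  # memory footprint of the operator
    succs: list = field(default_factory=list)   # names of successor nodes

@dataclass
class Device:
    name: str
    mem_limit: float
    mem_used: float = 0.0
    ready: float = 0.0                          # time the device becomes free

def upward_rank(nodes, avg_comm=1.0):
    """Rank each node by the average-cost length of its longest path to a sink."""
    rank = {}
    def rec(n):
        if n.name in rank:
            return rank[n.name]
        avg = sum(n.cost.values()) / len(n.cost)
        tail = max((rec(nodes[s]) for s in n.succs), default=0.0)
        rank[n.name] = avg + avg_comm + tail if n.succs else avg
        return rank[n.name]
    for n in nodes.values():
        rec(n)
    return rank

def heft_schedule(nodes, devices, comm=1.0):
    """Map nodes (highest rank first) to the device that finishes them
    earliest and still has memory available. Returns placement and makespan."""
    rank = upward_rank(nodes, comm)
    finish, placement = {}, {}                  # node -> (finish time, device), node -> device
    for n in sorted(nodes.values(), key=lambda x: -rank[x.name]):
        preds = [p for p in nodes.values() if n.name in p.succs]
        best = None
        for d in devices:
            if d.mem_used + n.mem > d.mem_limit:
                continue                        # skip devices that would run out of memory
            # data from a predecessor on another device pays a communication delay
            data_ready = max(
                (finish[p.name][0] + (0 if finish[p.name][1] == d.name else comm)
                 for p in preds), default=0.0)
            eft = max(d.ready, data_ready) + n.cost[d.name]
            if best is None or eft < best[0]:
                best = (eft, d)
        if best is None:
            raise MemoryError(f"no device can hold operator {n.name}")
        eft, d = best
        d.ready, d.mem_used = eft, d.mem_used + n.mem
        finish[n.name] = (eft, d.name)
        placement[n.name] = d.name
    return placement, max(t for t, _ in finish.values())
```

In this sketch the upward rank plays the role of the critical-path priority used by classical list schedulers, and the memory check is where inference placement departs from vanilla HEFT. The node-by-node loop is also what limits scalability on very large DAGs, which is presumably where the paper's DAG-partitioning step comes in: scheduling coarsened subgraphs instead of individual operators keeps the complexity manageable.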