Parallelized Training of Deep NN: Comparison of Current Concepts and Frameworks

2018 (modified: 18 Oct 2021) · DIDL@Middleware 2018
Abstract: Horizontal scalability is a major facilitator of recent advances in deep learning. Common deep learning frameworks offer different approaches for scaling the training process. We operationalize the execution of distributed training using Kubernetes and Helm templates. In this way, we lay the groundwork for a systematic comparison of deep learning frameworks. For two of them, TensorFlow and MXNet, we examine their properties with regard to throughput, scalability, and practical ease of use.
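The abstract does not spell out how a worker learns its role in the cluster. As a minimal sketch of the kind of setup involved: TensorFlow multi-worker training is configured through a TF_CONFIG environment variable describing the cluster, which in a Kubernetes deployment would typically be injected into each pod, e.g. from a Helm template. The strategy choice, host names, and port below are illustrative assumptions, not details from the paper (which predates TensorFlow 2 and may have used the parameter-server setup instead).

```python
import json
import os

import tensorflow as tf

# TF_CONFIG describes the cluster topology to each worker. In a Kubernetes
# deployment, a Helm template would render one "worker" entry per replica
# and set "index" per pod. A single local worker is used here so the sketch
# runs standalone. (Host name and port are hypothetical.)
os.environ.setdefault("TF_CONFIG", json.dumps({
    "cluster": {"worker": ["localhost:12345"]},
    "task": {"type": "worker", "index": 0},
}))

# Synchronous data-parallel training: gradients are averaged across workers.
strategy = tf.distribute.MultiWorkerMirroredStrategy()

with strategy.scope():
    # Variables created under the strategy scope are replicated and kept
    # in sync across all workers.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Every worker runs the same script; random placeholder data stands in for
# a real input pipeline.
x = tf.random.normal((256, 784))
y = tf.random.uniform((256,), maxval=10, dtype=tf.int64)
model.fit(x, y, epochs=1, batch_size=64)
```

Scaling out then amounts to launching more pods with the same image while the template varies only the TF_CONFIG contents, which is what makes a throughput-per-worker comparison across frameworks systematic.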