Highlights

• Compute resources are underutilized in collaborative machine learning.
• Underutilization leads to idle time and increases overall training time.
• Our work, Pipar, uses pipeline parallelism to reduce idle time and accelerate training.
• Pipar overlaps computation and communication.
• Pipar reduces idle time by up to 64.1x and accelerates training by up to 34.6x.
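The overlap claimed in the highlights can be sketched in miniature: while one step's communication is in flight, the next step's computation proceeds, hiding communication latency. This is a minimal illustration using a background thread, with a hypothetical `step_overlapped` helper and stand-in sleep workloads; it is not Pipar's actual implementation.

```python
import threading
import time

def step_overlapped(compute_fn, comm_fn):
    """Run computation while communication proceeds in a background
    thread, so communication latency is hidden behind compute (sketch)."""
    comm = threading.Thread(target=comm_fn)
    comm.start()          # launch communication for the previous step
    out = compute_fn()    # compute the current step concurrently
    comm.join()           # synchronize before the next step
    return out

# Stand-in workloads: time.sleep releases the GIL, so the two overlap.
compute = lambda: time.sleep(0.2) or "activations"
communicate = lambda: time.sleep(0.2)

start = time.time()
result = step_overlapped(compute, communicate)
elapsed = time.time() - start  # roughly 0.2 s overlapped, vs ~0.4 s sequential
```

Run sequentially, the two 0.2 s phases would take about 0.4 s per step; overlapped, the step finishes in roughly the time of the longer phase, which is the source of the idle-time reduction the highlights report.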