Mist: Efficient Distributed Training of Large Language Models via Memory-Parallelism Co-Optimization
Zhanda Zhu, Christina Giannoula, Muralidhar Andoorveedu, Qidong Su, Karttikeya Mangalam, Bojian Zheng, Gennady Pekhimenko
Published: 01 Jan 2025, Last Modified: 23 Apr 2025
EuroSys 2025
License: CC BY-SA 4.0