Abstract: Large language models (LLMs) have been widely adopted across various applications. However, their intensive computation and energy use have raised concerns about environmental sustainability. Our open-source tool, LLMCarbon, is the first comprehensive model for estimating the carbon footprint of LLMs before training. This paper reviews the LLMCarbon model, details its hardware efficiency model, presents a case study on training carbon footprints, and discusses recent research inspired by LLMCarbon and future directions.