Special Session: End-To-End Carbon Footprint Assessment and Modeling of Deep Learning

Published: 01 Jan 2024 · Last Modified: 18 Jan 2025 · CODES+ISSS 2024 · License: CC BY-SA 4.0
Abstract: Large language models (LLMs) have been widely adopted across various applications. However, their intensive computation and energy use have raised concerns about environmental sustainability. Our open-source tool, LLMCarbon, is the first comprehensive model for estimating the carbon footprint of LLMs before training. This paper reviews the LLMCarbon model, details its hardware efficiency model, presents a case study on training carbon footprints, and discusses recent research inspired by LLMCarbon and future directions.
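To make the kind of estimate the abstract describes concrete, the sketch below computes a first-order operational carbon figure for a training run from compute, hardware efficiency, and grid carbon intensity. This is a hedged illustration in the spirit of LLMCarbon-style modeling, not the tool's actual implementation; the function name and every numeric parameter are assumptions chosen for illustration.

```python
# Minimal sketch of a pre-training operational-carbon estimate.
# All parameter values below are illustrative assumptions, not
# figures taken from the LLMCarbon paper.

def training_carbon_kg(total_flops, device_peak_flops, hw_efficiency,
                       num_devices, device_power_w, pue,
                       grid_kgco2_per_kwh):
    """Estimate operational CO2e (kg) for one training run.

    total_flops:         total training compute (FLOPs)
    device_peak_flops:   peak throughput per device (FLOP/s)
    hw_efficiency:       achieved fraction of peak throughput (0..1)
    num_devices:         number of accelerators
    device_power_w:      average power draw per device (watts)
    pue:                 datacenter power usage effectiveness (>= 1)
    grid_kgco2_per_kwh:  grid carbon intensity (kg CO2e per kWh)
    """
    achieved_flops = device_peak_flops * hw_efficiency * num_devices
    hours = total_flops / achieved_flops / 3600.0
    energy_kwh = num_devices * device_power_w / 1000.0 * hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# Illustrative numbers only: a GPT-3-scale run (~3.15e23 FLOPs) on
# 1000 accelerators (19.5 TFLOP/s peak, 300 W each) at 30% achieved
# efficiency, PUE 1.1, grid intensity 0.4 kg CO2e/kWh.
kg_co2e = training_carbon_kg(3.15e23, 19.5e12, 0.30,
                             1000, 300, 1.1, 0.4)
```

A key point the abstract's "hardware efficiency model" addresses is that `hw_efficiency` is not a free constant: achieved throughput depends on model architecture, parallelization, and device count, which is what makes before-training estimation non-trivial.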