Carbon Literacy for Generative AI: Visualizing Training Emissions Through Human-Scale Equivalents

Published: 24 Sept 2025, Last Modified: 07 Nov 2025, NeurIPS 2025 Workshop GenProCC, CC BY 4.0
Track: Creative demo
Keywords: carbon emissions, sustainability, GenAI
TL;DR: Carbon Literacy for Generative AI
Abstract: Training large language models (LLMs) consumes vast energy and produces substantial carbon emissions, yet this impact remains largely invisible due to limited transparency. We compile reported and estimated training emissions (2018–2024) for 13 state-of-the-art models and reframe them through human-friendly comparisons, such as the number of trees required to absorb the emissions and per-capita footprints, via our interactive demo. Our findings highlight both the alarming scale of emissions and the lack of standardized reporting. We position this work as a contribution to Creative Practices, advancing public awareness and encouraging transparency in the reporting of generative AI (GenAI) models. Our demo is available at: https://neurips-co2-viz.vercel.app/
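The abstract describes translating training emissions into human-scale equivalents such as tree absorption and per-capita footprints. Below is a minimal, illustrative sketch of that kind of conversion; the constants (roughly 21 kg of CO2 absorbed per mature tree per year, and roughly 4.7 tonnes CO2 per person per year as a global average) are assumptions for illustration, not figures taken from the paper or the demo.

```python
# Illustrative conversion of training emissions (tonnes CO2e) into
# human-scale equivalents. Constants below are assumed, not sourced
# from the paper.

TREE_ABSORPTION_KG_PER_YEAR = 21.0   # assumed CO2 absorbed by one mature tree per year
PER_CAPITA_TONNES_PER_YEAR = 4.7     # assumed global-average per-capita annual footprint


def human_scale_equivalents(training_emissions_tonnes: float) -> dict:
    """Convert training emissions (tonnes CO2e) into human-friendly comparisons."""
    kg = training_emissions_tonnes * 1000.0
    return {
        "tree_years_to_absorb": kg / TREE_ABSORPTION_KG_PER_YEAR,
        "per_capita_annual_footprints": training_emissions_tonnes / PER_CAPITA_TONNES_PER_YEAR,
    }


if __name__ == "__main__":
    # Example: a hypothetical training run emitting 500 tonnes CO2e.
    print(human_scale_equivalents(500.0))
```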
Submission Number: 27