Carbon Emissions and Large Neural Network Training

Authors

David Patterson (Google)
Joseph Gonzalez (UC Berkeley)
Quoc Le (Google)
Chen Liang (Google)
Lluis-Miquel Munguia (Google)
Daniel Rothchild (UC Berkeley)
David So (Google)
Maud Texier (Google)
Jeff Dean (Google)

Abstract

This study analyzes the actual carbon footprint of training large neural network models, accounting for often-overlooked factors such as datacenter efficiency (PUE) and the carbon intensity of the energy grid powering the training.

The paper lays out a detailed methodology for estimating the CO2 emissions of a training run and demonstrates that the choice of datacenter location, and of when the training happens, can significantly change the environmental cost of AI training.
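
The core of that methodology is a simple product: training energy (hours of training × number of processors × average power per processor, scaled by the datacenter's power usage effectiveness, PUE) multiplied by the carbon intensity of the grid supplying the electricity. Below is a minimal sketch of that calculation under the units shown; the function name and parameter names are our own, not from the paper:

```python
def training_co2e_kg(hours, num_processors, avg_power_watts, pue,
                     grid_intensity_g_per_kwh):
    """Rough operational CO2e (kg) of one training run.

    energy (kWh) = hours * processors * avg power (kW) * PUE
    CO2e (kg)    = energy (kWh) * grid intensity (gCO2e/kWh) / 1000
    """
    energy_kwh = hours * num_processors * (avg_power_watts / 1000.0) * pue
    return energy_kwh * grid_intensity_g_per_kwh / 1000.0
```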

The authors show that thoughtful choices about where and when to train a model can cut its CO2 emissions by a factor of up to roughly 100 compared to making those choices at random.
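
To see how much the location factor alone can move the estimate, the snippet below reuses the function above with a hypothetical training run and two illustrative grid intensities (a carbon-heavy grid versus a low-carbon one). The specific numbers are assumptions for demonstration, not figures from the paper:

```python
# Hypothetical run: 100 hours on 512 accelerators at ~300 W each, PUE 1.1.
run = dict(hours=100, num_processors=512, avg_power_watts=300, pue=1.1)

# Illustrative grid intensities (gCO2e/kWh): carbon-heavy vs. low-carbon region.
dirty = training_co2e_kg(**run, grid_intensity_g_per_kwh=700)
clean = training_co2e_kg(**run, grid_intensity_g_per_kwh=30)

print(f"{dirty:.0f} kg vs {clean:.0f} kg CO2e -> {dirty / clean:.0f}x reduction")
# Combining grid choice with timing, processor, and model choices is what
# compounds into the up-to-~100x reductions the authors report.
```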
