Sustainable AI: Environmental Implications, Challenges and Opportunities

This comprehensive survey examines the environmental impact of artificial intelligence throughout its lifecycle, from development to deployment and maintenance. The paper provides a systematic analysis of the challenges in making AI more sustainable, including hardware efficiency, algorithm design, and operational practices. The authors identify key opportunities for reducing AI’s environmental footprint and propose a research agenda for sustainable AI development.

The ML.ENERGY Benchmark: Toward Automated Inference Energy Measurement and Optimization

As generative AI becomes increasingly integrated into real-world services, energy consumption has become a significant bottleneck, yet it remains under-measured and under-optimized in machine learning systems. This paper introduces the ML.ENERGY Benchmark and its accompanying Leaderboard, an open-source suite and evaluation platform for measuring and comparing the inference energy use of AI models in realistic service environments. The authors present four core principles for effective energy benchmarking and show how the tool applies them. Benchmark results report energy measurements for 40 popular model architectures across 6 tasks, include case studies on how design decisions affect energy use, and demonstrate that automatic optimization can cut energy consumption by over 40% without sacrificing output quality. The ML.ENERGY Benchmark is extensible, making it a practical resource for both researchers and practitioners seeking to evaluate and minimize the energy footprint of their AI applications.
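
To make the kind of measurement described above concrete, the sketch below times a single inference call and records the GPU energy it consumes using the open-source Zeus library's ZeusMonitor. This is only an illustration of per-request energy measurement, not the benchmark's own harness: the abstract does not specify its internal interface, and the model name, prompt, and GPU index here are placeholder assumptions.

# Minimal sketch (assumptions noted above): measure latency and GPU energy
# for one generation request. Requires a CUDA GPU, plus the zeus-ml and
# transformers packages.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from zeus.monitor import ZeusMonitor

model_id = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model choice
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")

# Monitor energy on GPU 0 only.
monitor = ZeusMonitor(gpu_indices=[0])

inputs = tokenizer("Explain energy-efficient inference.", return_tensors="pt").to("cuda")

# Everything between begin_window and end_window is attributed to this window.
monitor.begin_window("inference")
outputs = model.generate(**inputs, max_new_tokens=128)
measurement = monitor.end_window("inference")

print(f"Latency: {measurement.time:.2f} s")
print(f"GPU energy: {sum(measurement.gpu_energy.values()):.1f} J")

Dividing the measured joules by the number of generated tokens gives an energy-per-token figure, which is the sort of normalized metric that makes comparisons across the 40 benchmarked architectures meaningful.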