Awesome Green AI

A curated list of awesome Green AI resources and tools to reduce the environmental impacts of using and deploying AI.
https://github.com/samuelrince/awesome-green-ai

Category: Sustainable Development
Sub Category: Curated Lists

Keywords

ai awesome-list climate deep-learning green-ai green-software machine-learning sustainability sustainable-ai

Last synced: about 5 hours ago

Repository metadata

A curated list of awesome Green AI resources and tools to assess and reduce the environmental impacts of using and deploying AI.

README.md

Awesome Green AI 🤖🌱

A curated list of awesome Green AI resources and tools to reduce the environmental impacts of using and deploying AI.

In 2020, the carbon footprint of the Information and Communications Technology (ICT) sector was estimated at 2.1-3.9% of total global greenhouse gas emissions. The sector continues to grow, increasingly dominating other industries, and its footprint is estimated to double to 6-8% by 2025. For the ICT sector to comply with the Paris Agreement, the industry must reduce its GHG emissions by 45% between 2020 and 2030 and reach net zero by 2050 (Freitag et al., 2021).

AI is one of the fastest-growing sectors, disrupting many other industries (AI Market Size Report, 2022), so it has an important role to play in reducing the sector's carbon footprint. The impacts of ICT, and therefore of AI, are not limited to GHG emissions and electricity consumption: all major impacts (abiotic resource depletion, primary energy consumption, water usage, etc.) need to be taken into account using Life Cycle Assessment (LCA) (Arushanyan et al., 2013).

AI sobriety means not only optimizing energy consumption and reducing direct impacts, but also studying the indirect impacts and rebound effects that can negate all efforts to reduce the environmental footprint (Willenbacher et al., 2021). It is therefore imperative to question the use of AI before launching a project, in order to avoid indirect impacts and rebound effects later on.

All contributions are welcome. Add links through pull requests or create an issue to start a discussion.

🛠 Tools

Code-Based Tools

Tools to measure and compute environmental impacts of AI.

  • CodeCarbon – Track emissions from Compute and recommend ways to reduce their impact on the environment. Linux Mac Win GPU CLI
  • carbontracker – Track and predict the energy consumption and carbon footprint of training deep learning models. Linux GPU
  • Zeus – A framework for deep learning energy measurement and optimization. Linux GPU
  • Eco2AI – A python library which accumulates statistics about power consumption and CO2 emission during running code. Linux GPU
  • EcoLogits – Estimates the energy consumption and environmental footprint of LLM inference through APIs. Linux Mac Win GPU
  • Tracarbon – Tracks your device's energy consumption and calculates your carbon emissions using your location. Linux Mac GPU
  • AIPowerMeter – Easily monitor energy usage of machine learning programs. Linux GPU
  • carbonai – Python package to monitor the power consumption of any algorithm. Linux Mac Win GPU
  • experiment-impact-tracker – A simple drop-in method to track energy usage, carbon emissions, and compute utilization of your system. Linux GPU
  • GATorch – An Energy-Aware PyTorch Extension. Linux GPU
  • GPU Meter – Power Consumption Meter for NVIDIA GPUs. Linux GPU
  • PyJoules – A Python library to capture the energy consumption of code snippets. Linux GPU
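
Under the hood, most of these trackers follow the same pattern: sample device power, integrate it into energy, and multiply by a grid carbon-intensity factor. A minimal pure-Python sketch of that computation, with invented constants and function names (not the API of any tool above):

```python
def energy_kwh(power_samples_w, interval_s):
    """Integrate instantaneous power samples (watts) into energy (kWh)."""
    joules = sum(power_samples_w) * interval_s  # rectangle-rule integration
    return joules / 3.6e6  # 1 kWh = 3,600,000 J

def emissions_gco2e(kwh, carbon_intensity_g_per_kwh=475.0, pue=1.0):
    """Convert energy to gCO2eq; 475 gCO2eq/kWh is an illustrative
    world-average grid intensity, and PUE covers datacenter overhead."""
    return kwh * pue * carbon_intensity_g_per_kwh

# One hour of a hypothetical 250 W GPU draw, sampled once per second:
samples = [250.0] * 3600
kwh = energy_kwh(samples, interval_s=1.0)  # 0.25 kWh
print(emissions_gco2e(kwh, pue=1.2))       # ~142.5 gCO2eq
```

Real trackers replace the invented constants with live sensor readings (e.g. NVML for NVIDIA GPUs) and location-based carbon-intensity data.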

Monitoring Tools

Tools to monitor power consumption and environmental impacts.

  • Scaphandre – A metrology agent dedicated to electrical power consumption metrics. Linux Win Docker k8s
  • CodeCarbon – Track emissions from Compute and recommend ways to reduce their impact on the environment. Linux Mac Win GPU CLI
  • PowerJoular – Monitor power consumption of multiple platforms and processes. Linux Raspberry GPU CLI
  • ALUMET – A modular and efficient software measurement tool. Linux GPU CLI
  • cardamon – A tool for measuring the power consumption and carbon footprint of your software. Linux Mac Win
  • Boagent – Local API and monitoring agent focused on environmental impacts of the host. Linux
  • PowerLetrics – A framework to monitor and analyze power consumption metrics at the process level on Linux. Linux
  • vJoule – A tool to estimate the energy consumption of your processes. Linux GPU CLI
  • jupyter-power-usage – Jupyter extension to display CPU and GPU power usage and carbon emissions. Linux GPU
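
On Linux, several of these agents read the kernel's RAPL (Running Average Power Limit) counters, which expose cumulative energy in microjoules under /sys/class/powercap. A sketch of turning two such readings into average power; the 32-bit wraparound value below is an assumption for illustration (real agents read the actual range from the max_energy_range_uj file):

```python
def average_power_w(uj_before, uj_after, elapsed_s, max_energy_uj=2**32):
    """Average power (watts) from two cumulative microjoule readings."""
    delta_uj = uj_after - uj_before
    if delta_uj < 0:  # the cumulative counter wrapped around between readings
        delta_uj += max_energy_uj
    return (delta_uj / 1e6) / elapsed_s

# Synthetic readings: 30 J consumed over 2 s -> 15 W average draw
print(average_power_w(1_000_000, 31_000_000, 2.0))  # 15.0
```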

Optimization Tools

Tools to optimize energy consumption or environmental impacts.

  • Zeus – A framework for deep learning energy measurement and optimization. Linux GPU
  • GEOPM – A framework to enable efficient power management and performance optimizations. GPU k8s
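
Optimizers like Zeus explore configurations such as GPU power limits and batch sizes to trade training time against energy. A toy sketch of that selection step, loosely inspired by Zeus's weighted energy-time objective (not its actual implementation); the measurements and cost weights are invented:

```python
def best_power_limit(measurements, eta=0.5, max_power_w=300.0):
    """measurements maps power_limit_w -> (train_time_s, energy_j).
    The cost blends energy with time, scaled by max power so units match:
    eta = 1.0 minimizes energy only, eta = 0.0 minimizes time only."""
    def cost(item):
        _, (t, e) = item
        return eta * e + (1 - eta) * max_power_w * t
    return min(measurements.items(), key=cost)[0]

measurements = {
    300.0: (100.0, 28_000.0),  # fastest, most energy
    250.0: (108.0, 25_000.0),
    200.0: (125.0, 24_500.0),  # slowest, least energy
}
print(best_power_limit(measurements, eta=0.5))  # 250.0, the middle ground
```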

Calculation Tools

Tools to estimate environmental impacts of algorithms, models and compute resources.

Generic tools:

  • Boaviztapi – Multi-criteria impacts of compute resources, taking into account manufacturing and usage.
  • Datavizta – Compute resources data explorer, not limited to AI.
  • EcoDiag – Compute the carbon footprint of IT resources, taking into account manufacturing and usage (🇫🇷 only).
  • AI Carbon – Estimate your AI model's carbon footprint.
  • GenAI Carbon Footprint – A tool to estimate energy use (kWh) and carbon emissions (gCO2eq) from LLM usage.
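
These calculators generally sum an amortized share of the hardware's embodied (manufacturing) footprint with its usage-phase emissions, following the LCA approach mentioned above. A simplified sketch of that accounting, with all numbers invented for illustration:

```python
def footprint_kgco2e(embodied_kg, lifetime_h, usage_h,
                     avg_power_w, intensity_g_per_kwh, pue=1.0):
    """Embodied emissions amortized over the device lifetime,
    plus usage-phase emissions from metered energy."""
    embodied = embodied_kg * (usage_h / lifetime_h)
    usage_kwh = avg_power_w / 1000.0 * usage_h * pue
    usage = usage_kwh * intensity_g_per_kwh / 1000.0  # grams -> kilograms
    return embodied + usage

# A hypothetical server: 1000 kgCO2e embodied, 4-year lifetime (35,040 h),
# used 100 h at 400 W on a 300 gCO2e/kWh grid with PUE 1.2.
print(footprint_kgco2e(1000.0, 35_040.0, 100.0, 400.0, 300.0, pue=1.2))
# ~17.25 kgCO2e, of which ~2.85 kg is the amortized embodied share
```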

Leaderboards

📚 Papers

  • Energy and Policy Considerations for Deep Learning in NLP - Strubell et al. (2019)
  • Quantifying the Carbon Emissions of Machine Learning - Lacoste et al. (2019)
  • Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models - Anthony et al. (2020)
  • The carbon impact of artificial intelligence - Dhar (2020)
  • Green AI - Schwartz et al. (2020)
  • Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning - Henderson et al. (2020)
  • GPU Lifetimes on Titan Supercomputer: Survival Analysis and Reliability - Ostrouchov et al. (2020)
  • The Energy and Carbon Footprint of Training End-to-End Speech Recognizers - Parcollet et al. (2021)
  • Carbon Emissions and Large Neural Network Training - Patterson et al. (2021)
  • Green Algorithms: Quantifying the Carbon Footprint of Computation - Lannelongue et al. (2021)
  • Aligning artificial intelligence with climate change mitigation - Kaack et al. (2021)
  • A Practical Guide to Quantifying Carbon Emissions for Machine Learning researchers and practitioners - Ligozat et al. (2021)
  • Unraveling the Hidden Environmental Impacts of AI Solutions for Environment: Life Cycle Assessment of AI Solutions - Ligozat et al. (2022)
  • Measuring the Carbon Intensity of AI in Cloud Instances - Dodge et al. (2022)
  • Green AI: do deep learning frameworks have different costs? - Georgiou et al. (2022)
  • Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model - Luccioni et al. (2022)
  • Bridging Fairness and Environmental Sustainability in Natural Language Processing - Hessenthaler et al. (2022)
  • Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI - Budennyy et al. (2022)
  • Environmental assessment of projects involving AI methods - Lefèvre et al. (2022)
  • Sustainable AI: Environmental Implications, Challenges and Opportunities - Wu et al. (2022)
  • The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink - Patterson et al. (2022)
  • Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning - Henderson et al. (2022)
  • Towards Sustainable Artificial Intelligence: An Overview of Environmental Protection Uses and Issues - Pachot et al. (2022)
  • Method and evaluations of the effective gain of artificial intelligence models for reducing CO2 emissions - Delanoë et al. (2023)
  • Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models - Li et al. (2023)
  • Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training - You et al. (2023)
  • Trends in AI inference energy consumption: Beyond the performance-vs-parameter laws of deep learning - Desislavov et al. (2023)
  • Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training - Yang et al. (2023)
  • Toward Sustainable HPC: Carbon Footprint Estimation and Environmental Implications of HPC Systems - Li et al. (2023)
  • Reducing the Carbon Impact of Generative AI Inference (today and in 2035) - Chien et al. (2023)
  • LLMCarbon: Modeling the End-To-End Carbon Footprint of Large Language Models - Faiz et al. (2023)
  • The growing energy footprint of artificial intelligence - De Vries (2023)
  • Exploring the Carbon Footprint of Hugging Face's ML Models: A Repository Mining Study - Castano et al. (2023)
  • Exploding AI Power Use: an Opportunity to Rethink Grid Planning and Management - Lin et al. (2023)
  • Power Hungry Processing: Watts Driving the Cost of AI Deployment? - Luccioni et al. (2023)
  • Perseus: Removing Energy Bloat from Large Model Training - Chung et al. (2023)
  • Timeshifting strategies for carbon-efficient long-running large language model training - Jagannadharao et al. (2023)
  • From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference - Samsi et al. (2023)
  • Estimating the environmental impact of Generative-AI services using an LCA-based methodology - Berthelot et al. (2024)
  • Towards Greener LLMs: Bringing Energy-Efficiency to the Forefront of LLM Inference - Stojkovic et al. (2024)
  • Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training - Liu et al. (2024)
  • Engineering Carbon Emission-aware Machine Learning Pipelines - Husom et al. (2024)
  • A simplified machine learning product carbon footprint evaluation tool - Lang et al. (2024)
  • Measuring and Improving the Energy Efficiency of Large Language Models Inference - Argerich et al. (2024)
  • Beyond Efficiency: Scaling AI Sustainably - Wu et al. (2024)
  • The Price of Prompting: Profiling Energy Use in Large Language Models Inference - Husom et al. (2024)
  • Offline Energy-Optimal LLM Serving: Workload-Based Energy Models for LLM Inference on Heterogeneous Systems - Wilkins et al. (2024)
  • MLCA: a tool for Machine Learning Life Cycle Assessment - Morand et al. (2024)
  • Hype, Sustainability, and the Price of the Bigger-is-Better Paradigm in AI - Varoquaux et al. (2024)
  • Addition is All You Need for Energy-efficient Language Models - Luo et al. (2024)
  • E-waste challenges of generative artificial intelligence - Wang et al. (2024)
  • Green My LLM: Studying the key factors affecting the energy consumption of code assistants - Coignion et al. (2024)
  • LLM-Inference-Bench: Inference Benchmarking of Large Language Models on AI Accelerators - Chitty-Venkata et al. (2024)
  • A Beginner's Guide to Power and Energy Measurement and Estimation for Computing and Machine Learning - Jagannadharao et al. (2024)
  • From Efficiency Gains to Rebound Effects: The Problem of Jevons' Paradox in AI's Polarized Environmental Debate - Luccioni et al. (2025)
  • Understanding the environmental impact of generative AI services - Berthelot et al. (2025)
  • EcoServe: Designing Carbon-Aware AI Inference Systems - Li et al. (2025)
  • Towards Sustainable NLP: Insights from Benchmarking Inference Energy in Large Language Models - Poddar et al. (2025)
  • Unveiling Environmental Impacts of Large Language Model Serving: A Functional Unit View - Wu et al. (2025)
  • Beyond Test-Time Compute Strategies: Advocating Energy-per-Token in LLM Inference - Wilhelm et al. (2025)
  • Frugal AI: Introduction, Concepts, Development and Open Questions - Arga et al. (2025)
  • Energy Considerations of Large Language Model Inference and Efficiency Optimizations - Fernandez et al. (2025)
  • The ML.ENERGY Benchmark: Toward Automated Inference Energy Measurement and Optimization - Chung et al. (2025)
  • How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference - Jegham et al. (2025)
  • Breaking the ICE: Exploring promises and challenges of benchmarks for Inference Carbon & Energy estimation for LLMs - Sikand et al. (2025)
  • Not All Water Consumption Is Equal: A Water Stress Weighted Metric for Sustainable Computing - Wu et al. (2025)
  • Measuring the environmental impact of delivering AI at Google Scale - Elsworth et al. (2025)
  • More than Carbon: Cradle-to-Grave environmental impacts of GenAI training on the Nvidia A100 GPU - Falk et al. (2025) [supplemental material]
  • Video Killed the Energy Budget: Characterizing the Latency and Power Regimes of Open Text-to-Video Models - Delavande et al. (2025)
  • Ground-Truthing AI Energy Consumption: Validating CodeCarbon Against External Measurements - Fischer (2025)
  • Green Prompt Engineering: Investigating the Energy Impact of Prompt Design in Software Engineering - De Martino et al. (2025)
  • From FLOPs to Footprints: The Resource Cost of Artificial Intelligence - Falk et al. (2025)
  • Beyond Counting Carbon: AI Environmental Assessments Struggle to Inform Net Impact Decisions - Cook et al. (2025)
  • Kareus: Joint Reduction of Dynamic and Static Energy in Large Model Training - Wu et al. (2026)
  • Where Do the Joules Go? Diagnosing Inference Energy Consumption - Chung et al. (2026)
  • Small Talk, Big Impact: The Energy Cost of Thanking AI - Delavande et al. (2026)
  • Understanding Efficiency: Quantization, Batching, and Serving Strategies in LLM Energy Use - Delavande et al. (2026)
  • From Attributional to Consequential LCA: Which Theoretical Framework for Assessing AI’s Environmental Impacts? - Ekchajzer et al. (2026)
  • Small Bottle, Big Pipe: Quantifying and Addressing the Impact of Data Centers on Public Water Systems - Han et al. (2026)

Survey Papers

🏢 Reports

  • The great challenges of generative AI (🇫🇷 only) - Data For Good 2023
  • General framework for frugal AI - AFNOR 2024
  • Powering Up Europe: AI Datacenters and Electrification to Drive +c.40%-50% Growth in Electricity Consumption - Goldman Sachs 2024
  • Generational Growth — AI/data centers’ global power surge and the sustainability impact - Goldman Sachs 2024
  • AI and the Environment - International Standards for AI and the Environment - ITU 2024
  • Powering artificial intelligence: a study of AI’s footprint—today and tomorrow - Deloitte 2024
  • Artificial Intelligence and Electricity: A System Dynamics Approach - Schneider Electric 2024
  • Developing sustainable Gen AI - Capgemini 2025
  • Exploring the sustainable scaling of AI dilemma: A projective study of corporations' AI environmental impacts - Capgemini Invent 2025
  • Intelligence artificielle, données, calculs : quelles infrastructures dans un monde décarboné (Artificial intelligence, data, compute: what infrastructure for a decarbonized world?) (🇫🇷 only) - The Shift Project 2025
  • Measuring the environmental impacts of artificial intelligence compute and applications - OECD 2025
  • Recommendation ITU-T L.1801 - Guidelines for assessing the environmental impact of artificial intelligence systems - ITU 2026

Committers metadata

Last synced: 9 days ago

Total Commits: 125
Total Committers: 5
Avg Commits per committer: 25.0
Development Distribution Score (DDS): 0.032

Commits in past year: 19
Committers in past year: 2
Avg Commits per committer in past year: 9.5
Development Distribution Score (DDS) in past year: 0.053

Name Email Commits
Samuel Rincé s@r****e 121
Jae-Won Chung j****g@u****u 1
Flavien Lebarbé f****e@e****r 1
Even m****n@g****m 1
Cyril Beslay c****3 1

Issue and Pull Request metadata

Last synced: about 1 month ago

Total issues: 1
Total pull requests: 8
Average time to close issues: about 2 months
Average time to close pull requests: 6 days
Total issue authors: 1
Total pull request authors: 5
Average comments per issue: 1.0
Average comments per pull request: 1.0
Merged pull request: 5
Bot issues: 0
Bot pull requests: 0

Past year issues: 0
Past year pull requests: 1
Past year average time to close issues: N/A
Past year average time to close pull requests: 14 days
Past year issue authors: 0
Past year pull request authors: 1
Past year average comments per issue: 0
Past year average comments per pull request: 2.0
Past year merged pull request: 0
Past year bot issues: 0
Past year bot pull requests: 0

More stats: https://issues.ecosyste.ms/repositories/lookup?url=https://github.com/samuelrince/awesome-green-ai

Top Issue Authors

  • rvandernoort (1)

Top Pull Request Authors

  • Eugene-Levinson (2)
  • flebarbe (2)
  • evenmatencio (2)
  • hongping-zh (1)
  • jaywonchung (1)


Score: 6.309918278226517