- Grok AI ranks as the most environmentally friendly AI chatbot with just 0.17 grams of CO₂ emissions per query.
- AI models show significant variation in carbon footprint, with GPT-4 producing roughly 25 times more carbon emissions per query than the most efficient model.
- Large language models’ environmental impact correlates with their computational complexity, with search-integrated models generally showing higher emissions.
Experts at TRG Datacenters have analyzed the estimated emissions of various AI models, from training to real-time responses. Carbon emissions were calculated based on standard energy grid assumptions and expressed in grams of CO₂ equivalent per individual query. To provide context, emissions were compared to everyday digital activities like email, video streaming, and smartphone charging. Since AI infrastructure and energy sources vary, all figures are approximate and intended to provide a general perspective on the sustainability of these technologies.
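The conversion described above — per-query energy use multiplied by a grid carbon-intensity factor — can be sketched as below. The energy figure and grid intensity here are illustrative assumptions for the sake of the example, not the study's actual inputs.

```python
# Illustrative conversion: per-query energy (Wh) -> grams of CO2 equivalent,
# under an assumed average grid carbon intensity. Both numbers below are
# hypothetical, chosen only to show the arithmetic.
GRID_INTENSITY_G_PER_KWH = 480  # assumed grid average, g CO2e per kWh


def query_emissions_g(energy_wh: float,
                      intensity_g_per_kwh: float = GRID_INTENSITY_G_PER_KWH) -> float:
    """Convert per-query energy in watt-hours to grams of CO2 equivalent."""
    return energy_wh / 1000.0 * intensity_g_per_kwh


# A hypothetical query consuming 9 Wh on a 480 g/kWh grid:
print(query_emissions_g(9.0))  # 4.32 g CO2e
```

Changing either assumption shifts the result proportionally, which is why the article stresses that all figures are approximate.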
| AI Model | Estimated Inference CO₂ per Query (g) |
| --- | --- |
| Grok AI | 0.17 |
| Google Gemini | 1.6 |
| LLaMA (Meta AI) | 3.2 |
| Claude AI | 3.5 |
| Perplexity AI | 4.0 |
| ChatGPT (GPT-4) | 4.32 |
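The "25 times" gap cited in the findings can be reproduced directly from the table's figures, as in this short sketch:

```python
# Estimated inference emissions per query (grams of CO2e), from the table above.
emissions_g = {
    "Grok AI": 0.17,
    "Google Gemini": 1.6,
    "LLaMA (Meta AI)": 3.2,
    "Claude AI": 3.5,
    "Perplexity AI": 4.0,
    "ChatGPT (GPT-4)": 4.32,
}

baseline = min(emissions_g.values())  # most efficient model (Grok AI, 0.17 g)

# Print each model's emissions and its multiple of the most efficient model.
for model, grams in sorted(emissions_g.items(), key=lambda kv: kv[1]):
    print(f"{model}: {grams:.2f} g CO2e/query "
          f"({grams / baseline:.1f}x the most efficient)")
```

Running this shows GPT-4 at about 25.4 times Grok's per-query footprint, matching the headline figure.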
To access the full research, please follow this link.
Grok AI is the most environmentally friendly AI chatbot with an exceptionally low carbon footprint of just 0.17g CO₂ per query. The model is optimized for efficiency, using minimal computational resources while maintaining functionality. Its architecture appears specifically designed to reduce power consumption, making it a standout performer in environmental sustainability among major AI chatbots. This minimal impact is comparable to making a single Google search.
Google Gemini ranks 2nd with 1.6g CO₂ emissions per query. The model benefits substantially from Google’s specialized AI hardware infrastructure and significant investments in renewable energy. Although it is roughly nine times more carbon-intensive than Grok, Gemini operates at less than half the emissions of most competitors, highlighting Google’s focus on computational efficiency. The environmental impact equals watching a 10-minute YouTube video.
LLaMA (Meta AI) ranks 3rd with 3.2g CO₂ per query. Meta’s renewable energy commitments contribute to LLaMA’s relatively better performance compared to some competitors. The carbon footprint assessment identifies a concerning trend of increasing energy demands as Meta expands its AI operations, potentially affecting future emissions profiles. Using LLaMA is equivalent to sending 10 simple emails.
Claude AI comes in 4th with 3.5g CO₂ emissions per query. While slightly more efficient than GPT-4, Claude’s architecture prioritizes safety and reliability features that appear to increase computational demands. The model represents a middle ground in the emissions spectrum, balancing performance capabilities with moderate environmental impact. This carbon cost equals streaming a 10-minute YouTube video plus sending an email.
Perplexity AI lands in 5th with 4g CO₂ per query. The model’s integrated search capabilities contribute to its higher energy consumption profile. Emissions vary based on query complexity, with more elaborate searches driving higher carbon costs. Perplexity demonstrates how adding functionality beyond basic text generation increases environmental impact. Each query has the carbon equivalent of charging a smartphone 1-1.5 times.
ChatGPT (GPT-4) rounds out the ranking with the highest emissions at 4.32g CO₂ per query. As one of the most computationally intensive models studied, GPT-4 requires substantial processing power to generate responses. Its deep learning complexity creates an environmental cost 25 times greater than the most efficient competitor, highlighting the significant sustainability challenges facing advanced AI systems. A single GPT-4 query generates carbon equivalent to sending 21 emails or nearly a full phone charge.
A spokesperson from TRG Datacenters commented on the study: “As AI adoption continues to rise, finding ways to reduce its energy consumption will be key. Some models are already designed to be more efficient, but there is still room for improvement. Advances in hardware, more optimized AI models, and increased use of renewable energy in data centers could help lower emissions over time. AI is here to stay, but balancing innovation with sustainability will be essential in minimizing its environmental impact.”