Environmental Watchdog Presses AI Industry on ChatGPT’s Hidden Costs Disclosure


  • Large AI models like ChatGPT have substantial environmental costs, consuming high energy during training and operation.
  • The environmental impact includes significant carbon emissions, water usage, and strain on data centers.
  • Transparency and responsible AI use are essential to address these concerns and promote sustainable practices.

AI has become an integral part of modern life, powering applications, such as ChatGPT, that assist in tasks ranging from information retrieval to creative content generation. While the benefits of AI are evident, emerging research highlights a less-discussed concern: the substantial environmental costs associated with large language models like ChatGPT and Bing Copilot.

Energy consumption of language model training

Large language models such as ChatGPT and Bing Copilot require extensive computational power and electricity during training. According to a study by researchers at the University of Washington, training a single large language model like GPT-3 can consume as much as 10 gigawatt-hours (GWh) of energy. To put this into perspective, that is roughly equivalent to the yearly electricity consumption of over 1,000 U.S. households. Moreover, the carbon footprint of training GPT-3 can range from 55 to 284 tons of CO2, depending on the source of the electricity.
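As a back-of-envelope check of that comparison, assuming an average U.S. household uses roughly 10,000 kWh of electricity per year (an assumed figure, not one given in the study), the arithmetic works out as follows:

```python
# Back-of-envelope check of the training-energy comparison above.
# Assumption (not from the article): an average U.S. household uses
# roughly 10,000 kWh of electricity per year.
TRAINING_ENERGY_KWH = 10e6          # 10 GWh expressed in kWh
HOUSEHOLD_KWH_PER_YEAR = 10_000     # assumed average annual usage

households = TRAINING_ENERGY_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent to the yearly usage of ~{households:,.0f} households")
# → Equivalent to the yearly usage of ~1,000 households
```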

Running language models and their environmental impact

While running language models like Bing Copilot or ChatGPT consumes less energy than training them, inference still contributes to environmental concerns. Actual energy consumption depends on factors such as the model's size, the number of tokens processed, and hardware and software efficiency. Estimates suggest that a single GPT-4-powered ChatGPT query consumes between 0.001 and 0.01 kWh, significantly more than a typical Google search query (roughly 0.0003 kWh).
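Taking the quoted estimates at face value, the per-query gap can be expressed as a simple ratio:

```python
# Ratio of the per-query energy figures quoted above (estimates only).
GOOGLE_SEARCH_KWH = 0.0003          # typical Google search (quoted estimate)
CHATGPT_QUERY_KWH = (0.001, 0.01)   # low and high ChatGPT estimates

low = CHATGPT_QUERY_KWH[0] / GOOGLE_SEARCH_KWH
high = CHATGPT_QUERY_KWH[1] / GOOGLE_SEARCH_KWH
print(f"~{low:.0f}x to ~{high:.0f}x the energy of a Google search")
# → ~3x to ~33x the energy of a Google search
```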

The environmental impacts of using AI systems, including ChatGPT and Bing Copilot, are not negligible. As the demand for AI services grows, so does the need for data centers to house the servers and equipment that support these systems. Data centers are notorious for their energy consumption, which includes both running the hardware and managing power and cooling. Globally, data centers account for approximately 1-1.5% of electricity consumption and 0.3% of CO2 emissions. Additionally, these facilities use substantial amounts of water for both cooling and electricity generation.

A report by Livemint states that ChatGPT alone consumes around 800,000 liters of water per hour, equivalent to the daily water needs of 40,000 people. These figures underscore the substantial environmental footprint of AI systems.
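The implied per-person figure behind that comparison is easy to recover; it lines up with the roughly 20 liters per person per day that the WHO treats as a basic-needs minimum, which appears to be the benchmark the comparison assumes:

```python
# The implied per-person figure behind the water comparison above.
WATER_LITERS_PER_HOUR = 800_000     # quoted hourly water-consumption figure
PEOPLE_SERVED_DAILY = 40_000        # people whose daily needs it matches

liters_per_person = WATER_LITERS_PER_HOUR / PEOPLE_SERVED_DAILY
print(f"Implies ~{liters_per_person:.0f} liters per person per day")
# → Implies ~20 liters per person per day
```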

Reducing energy consumption and environmental impact

Improve Hardware and Software Efficiency: Enhancing the design and efficiency of hardware and software can reduce energy consumption. Techniques like liquid immersion cooling dissipate hardware heat more effectively, lowering both carbon emissions and water usage in data centers.

Transition to Renewable Energy: Shifting to renewable energy sources such as wind, solar, and hydro can power data centers more sustainably. Countries with abundant natural resources, like Norway and Iceland, have already adopted this approach to lower their carbon footprint.

Responsible Use of AI: Limiting the use of AI models to meaningful and essential applications while avoiding trivial or harmful purposes can contribute to energy conservation and social responsibility. Focusing on educational or artistic content creation, rather than generating fake news or spam, can have positive societal impacts.

The future of energy consumption in AI

The future of energy consumption in AI appears promising, with advancements in technology leading to more energy-efficient AI models and data centers. The adoption of renewable energy sources is also expected to increase. However, as AI becomes more ubiquitous, a continued emphasis on reducing energy consumption and promoting sustainable practices is essential.

Despite the growing concern about the environmental impact of AI, obtaining accurate data remains a significant challenge. Existing research and emerging studies rely on estimated datasets and projections, as developers have not publicly disclosed the full extent of AI’s energy inputs, carbon emissions, and water footprints.

A 2023 study by Li et al. estimated that global AI demand could drive the withdrawal of 4.2 to 6.6 billion cubic meters of water in 2027, more than half of the United Kingdom's total annual water withdrawal. This underscores the pressing need for greater transparency in assessing AI's environmental impact.

Call for greater accountability and transparency

To address these concerns effectively, there is a growing need for greater transparency regarding both operational and developmental emissions resulting from AI processes. Developers should disclose data related to water efficiency and provide comparisons of different energy inputs. Such transparency will enable informed decisions and assessments of the environmental impacts of language models like ChatGPT and Bing Copilot.


Emman Omwanda

Emmanuel Omwanda is a blockchain reporter who dives deep into industry news, on-chain analysis, non-fungible tokens (NFTs), Artificial Intelligence (AI), and more. His expertise lies in cryptocurrency markets, spanning both fundamental and technical analysis.
