(NOTE: This post represents my personal views about AI in my capacity as a private individual. It does not represent the views of any of my previous or current employers or promote their products or services.)
The announcement came as a surprise to me, though maybe it shouldn’t have.
The company I used to work for, a subsidiary of Household Name Big Tech Co., was pivoting away from its initial flagship product. We would still keep and support the product, of course; it was (and is) used by millions of people every day. But it wouldn’t be our main focus anymore.
Instead, we were now going to focus on an “integrated, AI-powered” product.
There it was. AI.
Yet another tech buzzword. A fad like crypto, bitcoin, VR, and any number of other so-called innovations that dominated headlines for a day and have now been relegated to the growing graveyard of failed tech experiments.
Unfortunately, the costs of this fad are myriad and damaging to nearly everyone, especially those who can least afford them.
It is not an exaggeration to say that the negative effects AI has on the planet are massive. Indeed, the costs to the environment may be the worst consequence of AI.
Yet relatively few of the numerous articles, blog posts, videos, and other media about AI include much, if any, information about these problems. Those that do mention climate or the environment in conjunction with AI usually focus on how AI can help fight climate change by improving energy efficiency. But improvements in one area (efficiency) do not negate the damage in others (water, electricity, carbon). Indeed, as we will see, some of the world’s largest tech companies are working to improve energy efficiency while using more water, electricity, and carbon every year, largely because of the demands of AI.
One of the biggest problems is the amount of water AI requires. According to researchers Li, Yang, Islam, and Ren, training GPT-3 resulted in 700,000 liters of water evaporated, and they estimate that global demand for AI in 2027 will lead to 4.2 to 6.6 billion (yes, billion) cubic meters of water withdrawn. (“Withdrawal” refers to freshwater removed from the ground or the surface for agricultural, industrial, or municipal use.)
Tech companies are among the main consumers of water today, large amounts of which are required for their products and services. For example, Google’s 2023 Environmental Report states that “[i]n 2022, total water consumption at our data centers and offices was 5.6 billion gallons”. This amount increased in 2023 to 6.4 billion gallons “primarily due to water cooling needs at [Google’s] data centers, which experienced increased electricity consumption year-over-year.” This is despite the fact that Google “prioritize[s] responsible water use in [its] data centers and [its] office operations around the world.”
Similarly, Microsoft stated in its 2022 Environmental Sustainability Report that its operations consumed almost 6.4 million cubic meters (1.7 billion gallons) of water in fiscal year 2022, an amount “proportional to our business growth year-over-year”. This amount increased in fiscal year 2023 to more than 7.8 million cubic meters (2.06 billion gallons) “in alignment with our business growth”.
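As a quick sanity check on the units in the two reports (Microsoft reports in cubic meters, Google in gallons), a short script using the standard conversion of 1 cubic meter ≈ 264.172 US gallons shows the figures line up:

```python
# Converting the water figures cited above between the units
# the two reports use. 1 cubic meter ~= 264.172 US gallons.
GALLONS_PER_CUBIC_METER = 264.172

def cubic_meters_to_gallons(m3: float) -> float:
    """Convert cubic meters of water to US gallons."""
    return m3 * GALLONS_PER_CUBIC_METER

# Microsoft, fiscal year 2023: ~7.8 million cubic meters consumed.
msft_fy23_gallons = cubic_meters_to_gallons(7.8e6)
print(f"Microsoft FY2023: {msft_fy23_gallons / 1e9:.2f} billion gallons")
# consistent with the ~2.06 billion gallons the report states

# Google, 2023: 6.4 billion gallons consumed, expressed in cubic meters.
goog_2023_m3 = 6.4e9 / GALLONS_PER_CUBIC_METER
print(f"Google 2023: {goog_2023_m3 / 1e6:.1f} million cubic meters")

# For scale: the 700,000 liters cited for GPT-3 training is 700 cubic meters,
# against a projected 4.2 to 6.6 *billion* cubic meters of global AI withdrawal.
```

Note that these are simple unit conversions of the companies’ own reported numbers, not independent measurements.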
True, both Microsoft and Google have committed to water replenishment projects that are supposed to replenish more water than they consume in the regions where they work. Microsoft also claims in its 2024 Environmental Sustainability Report that its “new datacenters are designed and optimized to support AI workloads and will consume zero water for cooling. This initiative aims to further reduce our global reliance on freshwater resources as AI compute demands increase,” but the report does not specify how cooling will work without water. Google, on the other hand, states that “we’ll continue using water cooling to improve our energy efficiency in certain geographies” but that they are “prioritizing responsible water use and water replenishment at new sites from the start.”
And it’s not just cooling that these companies need water for. Water is required for just about every part of AI, from manufacture to disposal.
But the problem isn’t just that water is required.
It’s that these processes lead to 1) loss of some of the water via evaporation and 2) contamination of the remaining water that is then discharged, resulting in water that can contain “toxic chemicals and/or hazardous wastes, which need additional processing before reused [sic] for other purposes.”
It’s the sheer amount of water that is required.
It’s the fact that it’s largely potable freshwater in question, freshwater that plants, animals, and humans need to survive, even as large parts of the planet already experience frequent droughts.
It’s the fact that these companies know all of this, yet they’re proceeding with AI anyway. Indeed, Google describes itself as an “AI-first company” that has “made AI foundational to every part of [its] business and all Google products.”
Electricity has nearly become a buzzword in itself in the realm of climate change. Together with carbon emissions, it is also one of the factors that is usually mentioned on the rare occasions when the environmental costs of AI are discussed in the media. The result is often a recommendation, not to cease using or producing AI completely, but to do so in more energy-efficient ways.
However, it must be noted that both energy efficiency and renewable sources of energy “are still costly to the environment,” as researchers Bender, Gebru, McMillan-Major, and Shmitchell point out. In 2020, for example, almost 14 million trees in Scotland were cut down and hundreds of thousands of acres of peatland dug up to make room for “wind farms” of electricity-generating turbines. In addition, greater efficiency tends to spur ever-greater demand and usage (a rebound effect known as the Jevons paradox), until further efficiency gains are exhausted and total consumption is higher than ever before.
The fact is that AI is already increasing usage of and demand for electricity, and this increase shows no signs of stopping or reversing. As MarketWatch reported earlier this year, it is currently estimated that electricity demand may increase from 8 terawatt-hours in 2024 to 652 in 2030, where 1 terawatt-hour equals 1 trillion watts of power consumed for 1 hour.
652 trillion watt-hours in only six years.
Are electricity grids ready for that kind of demand? In the US, states such as Texas and California are already experiencing rolling blackouts or are considering them as a last resort to avoid damaging grids. Can their grids, and others, be made ready in so short a time?
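To get a feel for what 652 terawatt-hours would mean for a grid, a back-of-envelope calculation (assuming the projection represents annual consumption) converts it into an average continuous power draw:

```python
# Average continuous power implied by an annual electricity consumption figure.
# Assumes the 652 TWh projection cited above is consumption over one year.
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

def twh_per_year_to_avg_gigawatts(twh: float) -> float:
    """Average continuous power (in GW) implied by annual consumption in TWh."""
    watt_hours = twh * 1e12   # 1 TWh = 1 trillion watt-hours
    return watt_hours / HOURS_PER_YEAR / 1e9

print(f"{twh_per_year_to_avg_gigawatts(652):.0f} GW average continuous draw")
```

Roughly 74 gigawatts of round-the-clock draw is on the order of dozens of large power plants running at full output, which gives some sense of why grid readiness is a real question.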
Google’s 2023 report does not state exactly how much electricity it uses, focusing instead on its carbon emissions due to high electricity usage. It does state that the company “buy[s] electricity directly from new wind and solar farms via long-term PPAs [power purchase agreements] on the grids where [Google] operate[s], and [the company] also buy[s] renewable power through utilities via renewable energy purchasing models that [Google] helped create.” Unfortunately, wind and solar power are not always available 24/7, so Google does still rely partly on carbon-based sources of energy. The 2024 report is much the same, but it does state that there was “growth in electricity demand” from 2022-2023. Microsoft’s reports are similar, with the 2022 report stating, “Electricity use accounts for the vast majority of Microsoft’s operational carbon emissions footprint.”
Which brings us to carbon.
In its 2023 Environmental Report, Google states that its emissions for 2022 were around 10.2M tCO2e (million metric tons of carbon dioxide equivalent). This amount increased in 2023 to 14.3M tCO2e. Microsoft’s numbers were even higher, coming in at approximately 12.9M tCO2e in 2022 and 16.7M tCO2e in 2023. Both companies support carbon removal projects, despite the numerous difficulties such projects involve and evidence that carbon removal may be less effective than simply avoiding emissions in the first place.
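The year-over-year growth these figures represent is easy to compute from the reported numbers:

```python
# Year-over-year growth implied by the emissions figures above (in M tCO2e).
def yoy_growth_pct(prev: float, curr: float) -> float:
    """Percentage change from one year's figure to the next."""
    return (curr - prev) / prev * 100

google_growth = yoy_growth_pct(10.2, 14.3)     # Google, 2022 -> 2023
microsoft_growth = yoy_growth_pct(12.9, 16.7)  # Microsoft, 2022 -> 2023
print(f"Google: +{google_growth:.0f}%, Microsoft: +{microsoft_growth:.0f}%")
```

Roughly 40% and 29% growth in a single year, respectively, at two companies that have both pledged to reduce their emissions.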
And unsurprisingly, one of the best ways to avoid emissions today is to avoid training and/or using AI models.
In 2019, University of Massachusetts at Amherst researchers Strubell, Ganesh, and McCallum estimated that, in terms of carbon emissions, training one model “is roughly equivalent to a trans-American flight,” while training another would emit more than 626,155 pounds of carbon dioxide, very nearly five times the lifetime emissions of the average car, fuel included.
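For scale, the 626,155-pound figure can be converted into the metric tons of CO2 equivalent that the corporate reports above use, and divided back into the per-car share implied by the “nearly five times” comparison:

```python
# Converting the 626,155 lb training-run estimate into other units.
LB_PER_METRIC_TON = 2204.62  # pounds per metric ton

total_lbs = 626_155
total_tonnes = total_lbs / LB_PER_METRIC_TON
per_car_lbs = total_lbs / 5  # "very nearly five times the average car"

print(f"~{total_tonnes:.0f} metric tons CO2 for one training run")
print(f"~{per_car_lbs:,.0f} lb per car-lifetime equivalent")
```

About 284 metric tons for a single training run: small next to a company-wide total in the millions of tonnes, but this was a 2019 estimate for one model, and models have only grown since.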
In discussing the specialized hardware requirements of such a model, they stated:
“[E]ven when these expensive computational resources are available, model training also incurs a substantial cost to the environment due to the energy required to power this hardware for weeks or months at a time. Though some of this energy may come from renewable or carbon credit-offset resources, the high energy demands of these models are still a concern since (1) energy is not currently derived from carbon-neural [sic] sources in many locations, and (2) when renewable energy is available, it is still limited to the equipment we have to produce and store it, and energy spent training a neural network might better be allocated to heating a family’s home.”
And it is well known that it’s the people who can least afford these costs who are likely to be forced to pay them.
This brings up the issue of ethics regarding AI. As Strubell et al. ask: “Is it fair or just to ask, for example, that the residents of the Maldives (likely to be underwater by 2100) or the 800,000 people in Sudan affected by drastic floods pay the environmental price of training and deploying ever larger English LMs [language models], when similar large-scale models aren’t being produced for Dhivehi or Sudanese Arabic?”
Originally published in The Quantastic Journal on Medium on August 12, 2024