Trading the Earth for Knowledge: the True Cost of AI

Photos by Nadely Y. Requena

By now you’ve most likely used, or at least heard of, artificial intelligence. Jay Hartzell, the president of the University of Texas at Austin, dubbed 2024 “the year of AI.” The idea of creating an automated system capable of thinking has fascinated mankind since long before computers, as we know them today, were even invented. 

The father of modern computers, Alan Turing, created the “Turing Test,” in which a human interacts with a machine. If the human can’t tell that they are interacting with a machine rather than another human, the machine passes the test. Scientists have worked for decades to create machines that can pass the Turing Test, and only recently have they been widely successful, with AI models such as OpenAI’s ChatGPT. 

ChatGPT can do everything from explaining calculus in caveman terms to having a heartfelt conversation with you. While the implications of a machine passing this test can be scary, they can also be extremely exciting and beneficial to society and our pursuit of knowledge. But what is this wisdom worth? 

This knowledge does not come out of thin air. To gather it, AI models must be given samples with known answers so they can learn how to respond to certain prompts, much like humans do. To train GPT-3, a model now outdated and less powerful than the current version of ChatGPT, OpenAI originally fed it thousands of pages of information from books and web pages like Wikipedia. The training stage for AI models like ChatGPT is estimated to take around 1,300 megawatt-hours (MWh) of energy. To put this in perspective, that is roughly the amount of energy used by 130 U.S. households over the course of a year. 

The inference stage, when AI generates answers to our questions, requires energy too. A joint research effort from Carnegie Mellon and Hugging Face used AI to classify answers from written examples, generate text from a question and generate images from text prompts, tracking the energy usage for a thousand repetitions of each task. They found that classifying a thousand answers took 0.002 kilowatt-hours (kWh), generating a thousand pieces of text took 0.047 kWh, and generating a thousand images took 2.907 kWh. While the results for the text-based tasks aren’t extremely concerning, the image results mean that generating a single image can use more than half the energy it takes to charge the average smartphone from 0–100%. (Smartphones on average take 0.005 kWh to charge fully.)

To further capture the scale of these findings, we’ll look at the data centers that house the hardware that processes our questions, such as the graphics processing units (GPUs) and servers that generate and deliver responses. Data centers are fully developed buildings with their own cooling and power infrastructure. With the rise of AI usage in search engines, translation applications and simple questions, it is estimated that OpenAI needs data centers housing 28,936 Nvidia GPUs spread across 3,617 Nvidia servers to keep up with demand. Nvidia is a computer hardware company whose chips are the gold standard for AI hardware, but even with this high-end equipment, running this many units and servers implies an energy usage of about 564 MWh per day. Each day of that usage is roughly as much energy as 50 U.S. households use in an entire year. 

And now, data centers located across the globe are beginning to work together. OpenAI recently extended its partnership with Microsoft to use Microsoft’s Azure platform and servers to help power ChatGPT and OpenAI’s other research. Worldwide, there are currently eight of these centers in the U.S., three in Europe and four in Asia. Microsoft is even exploring the idea of installing a center under the North Sea off the coast of Scotland, through Project Natick.

While spreading out the centers alleviates some of the environmental strain on any one area, they still use huge amounts of energy. In 2023, Microsoft and Google each used over 24 terawatt-hours (TWh) of energy, more than entire countries such as Iceland and Azerbaijan consume. However, it must be noted that Google has been carbon-neutral since 2007 and hopes to run carbon-free by 2030, while Microsoft aims to become carbon-negative by 2030. A carbon-neutral company removes the same amount of carbon from the atmosphere as it emits, and a carbon-negative company removes more carbon than it emits.

The closest data centers to Austin are Microsoft’s Azure data center in northern San Antonio and Google’s data center in Ellis County, just south of Dallas. However, Elon Musk has recently acquired over 100,000 graphics cards, the computer components that compute and process visual data. Musk plans to use these graphics cards to power his newly announced Cortex AI training supercluster at Tesla’s Gigafactory Texas, right next to Austin-Bergstrom International Airport. Essentially, Musk is building an unfathomably large computer system to train more powerful AI models. The supercluster will be used to train Tesla’s self-driving AI technology, and it will initially require about 130 megawatts (MW) of cooling, equivalent to 40,000 typical A/C units. Once the center is fully completed and running, it is expected to need 500 MW of cooling power to match projected demand. That is, if the Texas power grid can handle it. 

Infamously shaky, the Texas power grid is constantly under stress, whether from cooling the state’s 30 million residents in triple-digit heat or warming them in freezing temperatures. After the grid broke down during the 2021 winter storm, leaving over 4.5 million homes and businesses without power, Texas has been trying to expand and reinforce it. But can we trust our state to supply power to its people before it supplies Tesla? 

In Palo Alto, California, Tesla is already taking priority over residents. The city had planned new energy infrastructure to supply its residents with power more efficiently and at lower cost. But when Tesla announced it would build a new AI headquarters in the city, the plans had to be reworked entirely to accommodate Tesla’s power needs. Residents saw their energy prices rise while receiving fewer of the benefits, essentially paying part of Tesla’s energy bill. 

Austin’s Cortex center may not only strain an already shaky power grid but also result in higher energy costs for the city's residents.

Once the center is built and operating as expected, the graphics units and servers will be performing trillions of operations a second while generating thousands of watts of heat. As mentioned earlier, the units and servers need to be cooled to maintain optimal performance, and this requires transferring the thermal energy elsewhere, which usually means releasing the heat outside. All buildings emit heat; that’s why cities create something called a “heat island,” where the atmosphere around urban areas is typically warmer than in rural areas. Placing buildings like data centers so close to cities can drastically increase this effect. It is already evident in cities like Moscow, where the heat island effect can entirely stop weather fronts from moving in and daytime temperatures can run around six degrees warmer than in the surrounding countryside.

But just because we are using artificial intelligence does not mean that Earth will become one of the fiery rings of Hell. There is hope! 

Organizations such as the United Nations are using AI to monitor environmental impacts and climate change, create valuable simulations and track key metrics like methane emissions, air quality and environmental footprints. These examples mainly use AI to consolidate information and make predictions. Artificial intelligence is also being used to help areas like Sudan, Burundi and Chad, which are particularly vulnerable to extreme weather patterns since they rely so heavily on local agriculture and farming. The UN is using AI to predict weather formations over these areas, so communities can better prepare for the changing climate. While there’s still a question of whether these positive uses outweigh the negative side effects, it can be comforting to know this power is being put to good use.

New technologies can be exciting. But as we watch the line between man and machine fade more and more, it is necessary for us to analyze whether it is worth blurring that line.
