
The Environmental Impact of AI

By: Seoyoon Jin


[Image credit: ChatGPT. Rank Comfort, rankcomfort.com/unlocking-chatgpt-quick-guide/.]


In recent months, AI has made significant advancements in capability and quality, generating much buzz. Many concerns surround AI, including privacy violations, artistic credit, algorithmic bias from unreliable sources, and even fears of AI 'taking over humanity'. However, one concern that most people overlook is AI's environmental impact.


This may come as a surprise to some. When most people imagine using AI, they picture typing into a website like ChatGPT on a computer screen. If that were all there was to it, how could a few keystrokes harm the environment?


However, AI requires much more than a simple click on a screen. For big AI companies such as OpenAI, Microsoft, or Nvidia to provide their models to the public, many components come into play, often requiring enormous amounts of energy.


Training AI


Training an AI model takes a significant amount of energy. In 2019, researchers at the University of Massachusetts Amherst found that training a single large AI model can emit more than 626,000 pounds of carbon dioxide, roughly the lifetime emissions of five cars. But what exactly consumes so much energy?
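To put that number in more familiar units, the figure above can be converted with a quick back-of-the-envelope calculation (a sketch using only the numbers quoted in this article; the pounds-per-metric-ton conversion factor is standard):

```python
# Figures from the UMass Amherst estimate cited above.
training_co2_lbs = 626_000          # CO2 from training one large model, in pounds
lbs_per_metric_ton = 2204.62        # standard conversion factor

training_co2_tons = training_co2_lbs / lbs_per_metric_ton
per_car_lifetime_lbs = training_co2_lbs / 5   # implied lifetime emissions per car

print(round(training_co2_tons))     # about 284 metric tons of CO2
print(per_car_lifetime_lbs)         # 125,200 lbs per car over its lifetime
```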


AI companies begin training a model by exposing it to enormous amounts of data drawn from thousands of different sources, including books, articles, websites, and even social media.


Then, over weeks or months, the model learns to find relationships and patterns in that data. It does this through parameters: internal variables that allow the model to learn from the data and make accurate predictions. The more parameters a model has, the more accurate and flexible it becomes. As a result, AI companies tend to use more and more parameters to improve their models.


In fact, in 2018, a language model with 100 million parameters was considered quite large and 'high tech'. In 2019, GPT-2, the second model in OpenAI's GPT series, had 1.5 billion parameters. Then in 2020, GPT-3 arrived with 175 billion parameters, more than 1,000 times larger than what was considered 'high tech' in 2018. As AI becomes further developed, the energy needed to train all of these parameters also increases.
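The growth described above can be checked directly with the parameter counts this article cites (a simple arithmetic sketch, nothing more):

```python
# Parameter counts cited in this article (approximate).
params_2018 = 100_000_000         # a "large" language model in 2018
params_gpt2 = 1_500_000_000       # GPT-2 (2019)
params_gpt3 = 175_000_000_000     # GPT-3

print(params_gpt2 / params_2018)  # GPT-2 is 15x the 2018 model
print(params_gpt3 / params_gpt2)  # GPT-3 is roughly 117x GPT-2
print(params_gpt3 / params_2018)  # GPT-3 is 1,750x the 2018 model
```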


Another component of creating an AI model is the tens of thousands of advanced computer chips it runs on. These chips are required for the model to process and analyze data. Most commonly, they are graphics processing units (GPUs): specialized electronic circuits favored because they can perform many calculations simultaneously. Unfortunately, GPUs consume far more energy than conventional processors, precisely because of that parallel computation. With tens of thousands of these power-hungry chips running throughout the training process, it is no wonder that AI requires so much energy.


Maintaining and Operating AI


To maintain and operate an AI model, data centers have to run 24/7. Most of a data center's energy goes to running chips and processors and to powering the massive cooling systems that keep the heat-producing servers from overheating. Much of this energy comes from fossil fuels, non-renewable sources that release large quantities of carbon dioxide. Data centers consume so much energy that they account for an estimated 2.5 to 3.7 percent of global greenhouse gas emissions, exceeding those of the aviation industry.


Inference


Inference, the phase in which a trained model generates predictions from the input a user provides, consumes more energy overall than training does. When Google estimated how its AI energy use was split between training and inference, it found that only 40 percent was consumed during training, while the remaining 60 percent went to inference.


This matters for large AI models such as GPT-3, whose yearly carbon footprint from powering ChatGPT was estimated at 8.4 tons of CO2. Sixty percent of that is a substantial amount.
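Applying Google's 40/60 split to that footprint gives a rough breakdown (a sketch using only the figures quoted above; the split may differ for other companies and models):

```python
# Figures quoted in this article.
total_co2_tons = 8.4      # estimated yearly carbon footprint of GPT-3
training_share = 0.40     # Google's estimate for training
inference_share = 0.60    # Google's estimate for inference

print(round(total_co2_tons * training_share, 2))   # ~3.36 tons from training
print(round(total_co2_tons * inference_share, 2))  # ~5.04 tons from inference
```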


Inference takes so much energy because hundreds of millions of users send requests to a single model, most commonly ChatGPT. These users are eager to try AI for almost anything, since it is a relatively new technology that many find fascinating. Yet no matter how trivial the request, a single AI query consumes roughly 100 times more energy than a simple Google search.


As AI continues to develop and more sophisticated models are created, more people will use them, and AI's carbon footprint will inevitably grow.


What Can We Do?


To lessen the carbon footprint of AI, companies can power their models with renewable energy, find new methods to cool their data centers, make their servers more efficient, or use pre-trained models instead of training from scratch.


However, most of us do not have the power to make those kinds of changes. So what can ordinary individuals do about this?


First off, awareness is essential to bringing about change. Most people are unaware of the environmental harm AI poses, and possibly even some of the companies that build it are too. So let the people around you know about this issue, and encourage AI companies to adopt environmentally friendly practices.


Moreover, a simple habit change, such as using AI less frequently and turning to search engines like Google instead, can reduce energy consumption. As stated before, a single AI request, no matter how trivial, consumes roughly 100 times more energy than a Google search. To put this into practice, instead of asking ChatGPT "How many ounces are in a gallon?", ask Google or any other search engine the same question. You will most likely receive a similar answer.
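Using the 100-to-1 ratio cited above, the savings from that habit change can be estimated (a rough sketch; the 0.3 Wh per Google search is a commonly cited estimate, not a figure from this article, so treat the absolute numbers as illustrative):

```python
# Assumed per-query energy (0.3 Wh per search is a commonly cited estimate).
search_wh = 0.3
ai_request_wh = search_wh * 100   # the article's 100x figure

queries_per_day = 10              # hypothetical user asking 10 simple questions
daily_savings_wh = queries_per_day * (ai_request_wh - search_wh)

print(round(daily_savings_wh, 1))  # roughly 297 Wh saved per day
```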


Conclusion


As AI becomes increasingly sophisticated, its carbon footprint will continue to grow. Our efforts to decarbonize the planet could be undermined by this emerging issue, so it is crucial to raise awareness and keep it from intensifying.


It is our duty to protect our only home, so awareness and action must start now.




Works Cited:


ChatGPT. Rank Comfort, rankcomfort.com/unlocking-chatgpt-quick-guide/. Accessed 26 June 2024.

Cho, Renée. "AI's Growing Carbon Footprint." State of The Planet, edited by Columbia Climate School, 9 June 2023, news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/. Accessed 26 June 2024.

"Did You Know Training a Single AI Model Can Emit as Much Carbon as Five Cars in Their Lifetimes? 5 Tips to Reduce the Environmental Impact!" Super Micro Computer, 2024, www.supermicro.com/en/article/ai-training-5-tips-reduce-environmental-impact. Accessed 26 June 2024.

Kanungo, Alokya. "The Green Dilemma: Can AI Fulfill Its Potential Without Harming the Environment?" Earth.org, 18 July 2023, earth.org/the-green-dilemma-can-ai-fulfil-its-potential-without-harming-the-environment/. Accessed 26 June 2024.




