Artificial intelligence (AI) is becoming more prominent and more ingrained in day-to-day life, leading many sustainability-minded individuals to take a closer look at the carbon footprint of AI.
Does artificial intelligence create carbon emissions, and if so, how are those emissions measured, monitored, and mitigated?
This article examines the environmental impact of training AI and explores how artificial intelligence can be made to work to the environment’s advantage instead of the carbon footprint of AI becoming just another problem for the planet.
What Is the Carbon Footprint of AI Technology?
Artificial intelligence (AI) technology is a rapidly growing field, and while AI tools can significantly reduce society’s reliance on carbon-intensive machinery and processes, AI does not come without its own share of carbon emissions.
A recent article by Columbia Climate School examined the carbon footprint of AI.31
The carbon footprint of AI technology is the sum of the greenhouse gas (GHG) emissions, expressed as carbon dioxide equivalent (CO2e), generated in three phases: producing the computing hardware (e.g., chips), training the AI models, and operating the AI over its lifetime.
Carbon Footprint of AI = Carbon Footprint of Computing Technology Production + Carbon Footprint of Training AI + Carbon Footprint of Operating AI (Usage)
Most conversations about AI’s carbon footprint focus on the latter two categories, as these are unique to AI relative to other computing processes, though the hardware used for AI is also more complex and energy-intensive to produce than standard computing equipment.8
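As a hedged, illustrative sketch, the formula above is simply a sum of three components. In the snippet below, only the training figure comes from a number cited later in this article; the hardware and operations values are made-up placeholders:

```python
# Illustrative only: summing the three components of an AI model's lifetime
# carbon footprint. The hardware and operations figures are assumed
# placeholders; the training figure matches the GPT-3 estimate cited later.
hardware_production_t = 150.0       # chips, servers, networking gear (assumed)
training_t = 552.0                  # reported GPT-3 training emissions (t CO2e)
operations_t = 8.4 * 3              # ~8.4 t CO2e/year over an assumed 3-year life

total_footprint_t = hardware_production_t + training_t + operations_t
print(f"Total lifetime footprint: {total_footprint_t:.1f} t CO2e")
```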
Each artificial intelligence model has a different carbon footprint, depending on the size and complexity of the model. The IT sector in general is estimated to account for around 4% of global emissions, but this number is growing rapidly, and AI’s portion of IT’s CO2e output is also expanding.
There are no published estimates yet for the AI industry’s total footprint, and the field is expanding so rapidly that an accurate number would be nearly impossible to obtain. Examining the carbon emissions of individual AI models is the best approach available at this time.
A 2022 blog by Google reported that as much as 15% of the company’s total energy consumption over a three-year period was used for machine learning, or artificial intelligence programs and tools.
The majority of the energy is dedicated to powering and cooling the hyperscale data centers, where all the computation occurs.22
What Is AI (Artificial Intelligence)?
Artificial intelligence (AI) is considered by many to be the future of technology, economy, national security and advancement, and potentially every aspect of society, but many people are still asking one fundamental question: “What is AI?”
The concept of artificial intelligence has been around since the 1950s, but it has continually evolved, resulting in many widely accepted definitions of artificial intelligence in use today.
In brief, artificial intelligence (AI) can be defined as the science and technology of creating intelligent machines and programs that can conduct complex processes ordinarily reliant on human intellect.26
The National Artificial Intelligence Initiative Act of 2020 (H.R.6216)32 defines artificial intelligence as “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.”2
Artificial intelligence has become so prominent in scientific discussions and research over the last decade that national governments are establishing independent groups, such as the National Artificial Intelligence Initiative Office (NAIIO),33 to drive and regulate AI research and development.
Carbon Footprint of AI: Components of AI
There are various components of artificial intelligence and different sources tend to categorize or classify these uniquely.
Artificial intelligence primarily deals with the processes of reasoning, problem-solving, language processing, perception, and learning, and this leads to distinct, though interrelated, subfields within AI.
Depending on the information source, there are between 4 and 8 subfields of AI. The most consistently agreed upon are the following:13
- Machine Learning: Includes the subset deep learning. Involves the use of algorithms, datasets, and statistical models which allow machines to process information independently and make predictions and decisions.
- Robotics: Development of autonomous or semi-autonomous robots which rely on artificial intelligence to interact with their surroundings. Drones, robotic vacuums, and self-driving cars are all applications of AI robotics.
- Computer Vision: Development of machines which can process visual information to categorize and classify images or detect and track objects.
- Natural Language Processing: Development of machines which can comprehend human language, essential for applications such as translation and virtual assistants.
- Expert Systems: Development of machines and programs which are capable of making complex decisions using available data and inferential reasoning. Useful for a wide range of customer service applications, from healthcare analysis to financial advisory.
- Generative: Development of programs able to create a wide range of content from a given prompt, from realistic artwork to long-form fiction.6
Other subfields that are sometimes listed separately include:
- Neural Networks
- Evolutionary Computation
- Fuzzy Logic
- Deep Learning
What Can Artificial Intelligence Do?
It seems there are few limits to what artificial intelligence can do. The current capabilities of AI are vast, and the potential to move beyond these is simply astonishing.
Here are a few of the ways that artificial intelligence is currently employed in daily life:4
Virtual Assistants
Anyone who has ever asked Alexa to play a favorite song, requested Google to pencil in a calendar event, or queried Siri for directions to a new destination has relied on a virtual assistant.
Furthermore, a growing number of websites use chatbots to handle customer service concerns in real time.
Smart Vehicles
Self-driving vehicles used to be a concept reserved for science fiction,34 but each day brings them closer to reality.
Companies such as Tesla and Toyota are designing vehicles with complex computer programs which rely on aggregated, real-time data from GPS, vehicle cameras, and police radar to provide information about routes and road hazards and assist the vehicle operator.
This technology has led to vehicles with AI-moderated cruise control, steering, braking, and lane changing.
Customer Experience Personalization
AI is used increasingly to improve user experience by tracking user data and making recommendations to appeal to a person’s interests.
This function of AI makes browsing, e-commerce, and social media more personalized and efficient.
Fraud Prevention and Spam Filtering
Financial institutions increasingly rely on AI to ensure data security and monitor fraud risks and cyber threats.
AI is also used at a more basic level to filter through spam mail.
Security Systems
Artificial intelligence is used in high-tech security systems, employing AI’s facial recognition capabilities to identify threats and detect suspicious activity.
Infrastructure
AI is also playing a more essential role in infrastructure and planning. Using artificial intelligence to monitor and analyze traffic patterns can help businesses transport goods more efficiently.
It can also help city planners with road design. On a smaller scale, AI is used to inform the layout of shops and markets by monitoring and evaluating the way customers move through various stores.
Smart Homes
Combining a myriad of AI technologies, smart homes are the future of home management, security, and maintenance. Artificial intelligence is commonly used to adjust the thermostat and operate devices in the home.
Home robots are also increasingly prevalent with robotic vacuums and lawn mowers handling regular chores around the home.4
These areas represent a handful of the ways in which AI is used each and every day.
However, artificial intelligence can do much more than this, and researchers are developing increasingly complex programs. These can:19
- Generate creative content that may be indistinguishable from human art
- Read brain signals and translate them into speech
- Use artificial olfactory sensors to identify smells connected with various disease processes
- Detect and identify sounds and pinpoint locations
How Does an Artificial Intelligence Work?
This all sounds incredible and maybe even unbelievable.
Yet, all of these processes are possible for AI, leaving many to wonder: “How does an artificial intelligence work?”
Explained as simply as possible, artificial intelligence works by combining data and algorithms to generate desired output. Thus, for AI to be effective, there must be a large dataset of reference points.
Additionally, the more complex the algorithmic programming used in AI development, the more the AI is capable of and the more closely it can approximate human behavior.
Artificial intelligence works by relying on its programmed algorithms to analyze and process data, drawing conclusions or generating new information.28
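The data-plus-algorithm idea can be illustrated with a toy example: a one-variable linear model "trained" on a handful of data points by least squares, then used to predict an unseen input. All the numbers here are invented for illustration:

```python
# A toy illustration of "data + algorithm -> prediction": a one-feature
# linear model fit by least squares in pure Python (no AI framework).
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]  # (input, observed output)

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / \
        sum((x - mean_x) ** 2 for x, _ in data)
intercept = mean_y - slope * mean_x

# The "trained" model can now generalize to an input it has never seen.
predict = lambda x: slope * x + intercept
print(round(predict(5), 2))   # 9.85
```

Real AI models follow the same pattern at vastly larger scale: more data, more parameters, and far more computation, which is where the energy cost comes from.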
What Is Generative Artificial Intelligence?
Generative artificial intelligence is a rapidly expanding subfield in AI research. Where traditional models have focused on an AI’s ability to draw conclusions or make predictions from reference data, generative AI goes a step further to create new data or content.
Generative AI is much more complex than discriminative models, which are limited to decision-making and prediction.35
Generative AI can take a simple prompt (a sentence, an image, etc.), and using the set of programmed data and parameters, create something similar but original. The best generative AI tools are often capable of producing creative content that is nearly indistinguishable from human-generated content.18
The leading generative AI tools can be classified into several categories, and many tools fit into multiple categories, but three primary categories of generative models are:12,14
- Generative Adversarial Networks (GANs): Used to create synthetic data and realistic images.
- Transformer-Based Models: Used primarily for text generation; Large Language Models (LLMs) fall into this category.
- Variational Autoencoders (VAEs): Used for image, audio, and video generation.
Other models include Neural Radiance Fields (NeRFs), diffusion models, and more.
What Is the Carbon Footprint of Training AI?
The carbon footprint of training AI models depends upon the size of the model. The more complex the model is (in other words, the larger the data sets, more numerous parameters, and more complicated algorithms), the more energy is required to train the model.
This energy consumption translates directly into the overall carbon footprint of the training process.
There is a very wide range of emissions from model to model, and some sources indicate that training a large language model can result in anywhere from 200 to 600 metric tons of CO2e. Open AI’s GPT-3, for example, generated 552 metric tons of CO2e during training.9
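The relationship between training energy and emissions is a simple multiplication by the carbon intensity of the electricity used. A sketch, using an intensity back-calculated from the GPT-3 figures above (so it is an inference for illustration, not an official grid value):

```python
# emissions = energy consumed x carbon intensity of the electricity.
# The intensity below (~0.429 kg CO2e/kWh) is inferred from the reported
# GPT-3 figures (552 t CO2e / 1,287 MWh); it is not an official number.
training_energy_kwh = 1_287_000            # 1,287 MWh reported for GPT-3 training
intensity_kg_per_kwh = 0.429               # assumed average grid mix

training_emissions_t = training_energy_kwh * intensity_kg_per_kwh / 1000
print(f"{training_emissions_t:.0f} metric tons CO2e")   # close to the reported 552 t
```

The same energy figure run on a cleaner grid (a lower intensity) would produce proportionally fewer emissions, which is why data center energy sourcing matters so much.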
Google explored the carbon footprint of their own AI models and reported that approximately 40% of the company’s AI energy consumption occurs during the training process while 60% is used during “inference,” or the AI’s daily operations.22 Other sources suggest a much more lop-sided distribution, with as much as 90% of energy consumption used during inference.
Still, training emissions cannot be discounted. The CO2e associated with training GPT-3 alone is more than triple the emissions of a passenger jet flying across the U.S. and back!23
Another important issue to note is how quickly an artificial intelligence can become outdated, requiring retraining and modification or an entirely new model.
What Is the Carbon Footprint of Operating AI?
Somewhere between 60% and 90% of an AI model’s energy demand is accounted for by model inference, the operational phase of its life. Of course, the amount of CO2e to which this energy demand translates depends entirely on the energy efficiency of the data centers themselves.
Most sources agree that AI models use most of their energy during inference, but what does that actually look like in numbers?
Per Google, a single AI-assisted Google search emits 3,223 kg CO2e, a much lower figure than a data center running on less sustainable energy would generate.22 GPT-3’s daily CO2e is estimated at 50 pounds, amounting to an annual carbon footprint of 8.4 tons of CO2e.8
What Is a Carbon Footprint?
Anyone who has ever wondered “What is a carbon footprint?” can look to the United States Environmental Protection Agency (EPA)’s definition.
It defines the carbon footprint as the amount of greenhouse gases (carbon dioxide, nitrous oxide, methane, etc.) emitted annually by a specific entity, from an individual to a corporation.
What Is the Carbon Footprint of the Internet?
The carbon footprint of the internet may seem like a small topic, but emissions from internet infrastructure, computing, and usage are increasing all the time.
The internet’s contribution to global emissions was recently estimated at 3.7% of total greenhouse gas emissions, a share expected to roughly double by 2025 and nearly quadruple by 2040.27
Comparing the Carbon Footprint of AI Models (AI Energy Consumption and CO2e)
As artificial intelligence increases in complexity (such as the rapidly proliferating parameters of large language models), it requires more computing power, increasing energy demands and consumption at data centers. Unfortunately, AI’s energy costs are not consistently tracked or reported, and there is a great deal of missing data in this field.
The following table takes a look at AI energy consumption and compares the carbon footprint of AI models using the limited data available.
Carbon Footprint of AI Models

| Large Language Model | Parent Company | Launch Date | # of Parameters | Training Energy Consumption | Inference Energy Consumption |
|---|---|---|---|---|---|
| GPT-1 | OpenAI | 06/11/2018 | 150 million | | |
| GPT-28 | OpenAI | 02/14/2019 | 1.5 billion | | |
| GPT-39 | OpenAI | 06/11/2020 | 175 billion | 1,287 MWh; 552 metric tons CO2e | 50 lbs CO2e daily; 8.4 t CO2e annually |
| GPT-4 | OpenAI | 03/14/2023 | 1.7 trillion | ~60,000 MWh; ~13,000 metric tons CO2e (grid electricity); ~1,100 metric tons CO2e (sustainable energy) | |
| Bard29 | Google | 03/21/2023 | 137 billion | 312 MWh | |
| Bing (ChatGPT)1 | Microsoft/OpenAI | | 17 billion | 7,200 MWh | 3,223 kg CO2e per search |
| Meena23 | Google | 12/2019 | 2.6 billion | 232 MWh; 96.4 t CO2e | |
| T523 | Google | 09/2019 | 11 billion | 85.7 MWh; 46.7 t CO2e | |
| Switch Transformer | Google | 10/2020 | 1.6 trillion | 179 MWh; 59 t CO2e | |
| LLaMa30 | Meta | 02/24/2023 | 65.2 billion | 2,638 MWh; 1,015 t CO2e | |
* NOTE that the data presented is not uniform and includes varying parameters.17
As the table above illustrates, each subsequent iteration of language models brings more numerous and refined parameters. Compare OpenAI’s first large-scale language model, GPT-1, released in 2018 with approximately 150 million parameters, with GPT-2, launched just eight months later with an incredible 1.5 billion parameters!5
This is a drastic and incredibly rapid upscale in model size.
Yet, the growth and refinement of language models have not slowed by any means. GPT-3 was released in mid-2020 with 175 billion parameters, and some sources indicate that GPT-4 has more than a trillion parameters!8,10
Considering the tremendous size of the dataset GPT-4 relies upon, it makes sense that training the GPT-4 model would be the most time-intensive, cost-intensive, and energy/carbon-intensive to date.
How Big Is ChatGPT?
ChatGPT is a relatively new AI tool, introduced in late 2022, that appears to be taking the tech world and society by storm. ChatGPT is an artificial intelligence chatbot developed on OpenAI’s GPT-3 large language model framework.
Newer versions run on the GPT-3.5 or GPT-4 language models.
ChatGPT is revolutionary because it is a tool that can easily be used by the general public rather than just the scientific community or large corporations. It is also pivotal as it runs on some of the most complex language modeling currently available.
This incredible chatbot allows an individual to input a query and receive a direct response which may include follow-up questions.16
OpenAI’s chatbot appears to be wildly popular, but exactly how big is ChatGPT? It is big: its training data comprised some 570 gigabytes of text. Within its first week, ChatGPT secured over a million users, and it is estimated that more than 100 million people actively use the site.
ChatGPT is so popular that some estimates suggest it costs more than $700,000 daily to run.5
ChatGPT Energy Consumption
ChatGPT is becoming a household name. The popular chatbot runs on large language models developed by OpenAI and is hosted on Microsoft’s Azure cloud.
When considering ChatGPT’s energy use, first consider the energy costs of training the large language model it runs on.
According to a University of Washington professor of computer engineering, the energy cost of training a language model this large can amount to as much as 10 gigawatt-hours (GWh) of electricity. This consumption is primarily due to the computational power in play at data centers during model training, as well as the power used by the cooling systems needed to maintain safe computer temperatures.
Putting this into perspective, this amount of energy could likely power more than 1,000 homes in the U.S. for an entire year!20
ChatGPT’s original version ran on the AI model GPT-3, a complex natural language processing (NLP) model with 175 billion parameters. Newer versions rely on GPT-4, an NLP model with an alleged 1.7 trillion parameters.
Note that the number above provides an estimate of the energy used during the training of a ChatGPT language model, but ChatGPT energy consumption only intensifies during the operational phase. According to one article, ChatGPT’s computing power relies on 3,500 supercomputers and 30,000 graphics processing units (GPUs).5
Daily ChatGPT requests are believed to number in the hundreds of millions. To process this incredible volume of queries, ChatGPT may use approximately 1 GWh of electricity daily.
The UW professor compares this electricity consumption to the amount of energy required to power more than 30,000 homes in the U.S. for a single day!20
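The back-of-the-envelope arithmetic behind that comparison might look like the following (the per-household figure is an assumed average, not a number from the article):

```python
# Rough check on the homes comparison. Assumes an average U.S. household
# uses about 29 kWh of electricity per day (an assumed figure).
daily_inference_kwh = 1_000_000        # ~1 GWh per day, per the estimate above
home_kwh_per_day = 29                  # assumed average U.S. household usage

homes_powered = daily_inference_kwh / home_kwh_per_day
print(f"~{homes_powered:,.0f} homes for one day")
```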
How Much Did It Cost To Train GPT 3?
For anyone who has wondered “How much did it cost to train GPT-3?”: there are significant monetary costs associated with training AI models.
OpenAI’s previous-generation large language model, GPT-3, cost approximately $4.6 million to develop and train.24 However, this number seems small compared to the whopping $100 million used to train the vastly larger 2023 model, GPT-4.
AI Environmental Impact: AI’s Impact on Climate Change (Artificial Intelligence)
When it comes to climate change, artificial intelligence’s impact is bigger than most people realize. The conversation about AI’s environmental impact is gaining traction as the field swells with new research and applications.
The most concerning aspect of AI is how quickly the field is growing. With market expansion nearing 40% annually, technology efficiency improvements will be hard-pressed to keep up.
The other concerning aspect of AI models is their increasing complexity and the quest to reduce the margin of error in modeling. One study reported that the training required to reduce a model’s margin of error by 6.5 percent could consume as much energy as New York City uses in an entire month!25
This information is startling because it is readily apparent that artificial intelligence growth will not be slowing down in the near future. The quest for larger, more accurate models is likely to continue its forward push, consuming vast amounts of energy.
On this trajectory, AI’s climate change impact could become insurmountable.
Can We Eliminate the Carbon Footprint of AI? (AI Optimization and Sustainability AI)
Eliminating the carbon footprint of AI will be very difficult if the market meets its projected compound annual growth rate of 37.3% between 2023 and 2030.3 Although this is a monetary statistic, it represents the rate of growth for the overall industry.
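To see why a 37.3% compound annual growth rate is so dramatic, consider the compounding over the 2023–2030 window (a simple arithmetic sketch, not a forecast of its own):

```python
# Compound growth: size_2030 = size_2023 x (1 + CAGR)^years.
cagr = 0.373
years = 2030 - 2023                     # 7 years of compounding

growth_factor = (1 + cagr) ** years
print(f"~{growth_factor:.1f}x market size by 2030")   # roughly a ninefold increase
```

Energy efficiency would have to improve at a comparable compounded rate just to hold AI's footprint steady.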
This means that climate researchers and AI researchers must be more proactive than ever to stay ahead of AI’s increasing energy demand. Until now, efficiency gains in the IT sector have largely offset exponentially increasing computational demands.
However, there is no guarantee that efficiency improvements can continue to keep pace with this growth.
Researchers at Google have proposed four ways to reduce and even eliminate the carbon footprint of AI, which they call the “4Ms.” The 4Ms include:
- Machine: Use specialized systems and processors to reduce energy demand.
- Model: Use optimized machine learning models to decrease computation and redundancy.
- Map: Choose data centers with cleaner energy to reduce the emissions associated with computation.
- Mechanization: Use cloud-based computation, which is more energy efficient than on-site data centers.22
Microsoft’s roadmap for greener AI technology is similar to that of Google. The company elaborates on how meta-learning techniques enable machines to learn from their prior experience to become more efficient with successive iterations.
In other words, artificial intelligence can and should be programmed to teach itself to achieve higher levels of efficiency over time.11
According to the International Telecommunication Union (ITU), AI models do indeed use most of their energy during inference rather than training. This means that eliminating data center dependency on fossil-fuel-derived electricity is critical for sustainability AI, or the prospect of eliminating the carbon footprint of artificial intelligence.15
Not all data centers are created equal. Large-scale data centers require more power to maintain operations due to their sheer size, but they are much more efficient than smaller data centers and colocation centers, which typically are not equipped with the newest technology.
A common measure of data center efficiency, the Power Usage Effectiveness (PUE) ratio, compares a data center’s total energy usage to the energy dedicated to actual computation (the remainder goes to water transport, cooling, fire prevention, etc.); a PUE of 1.0 would mean every watt goes to computing. Hyperscale data centers average approximately 1.57 PUE, but Google has reportedly lowered its PUE to 1.1 by optimizing materials and cooling processes.
This is critical because a greener data center means greener AI model inference.15
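A PUE calculation itself is straightforward; the 1,570/1,000 split below is a hypothetical example chosen to reproduce the hyperscale average mentioned above:

```python
# PUE = total facility energy / energy delivered to IT equipment.
# Lower is better; a PUE of 1.0 would mean zero overhead for cooling,
# power distribution, and other non-computing loads.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,570 kWh consumed to deliver 1,000 kWh of computing.
print(pue(1_570, 1_000))   # 1.57, matching the hyperscale average cited above
```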
Another important point made by the ITU, which ties in with Google’s ‘Model’ criterion, is that there is a tradeoff between accuracy and energy use. Larger, more complex systems are generally more accurate, but at an energy cost that may be too high.
Striking the proper balance between accuracy and efficiency for AI optimization is an important consideration.15
Can We Use AI To Reduce Carbon Footprint?
The carbon footprint of AI is definitely concerning, particularly with the growth trajectory forecasted for the next decade.
The good news, aside from the fact that artificial intelligence can be made greener, is that scientists can use AI to reduce carbon footprints of human activity.
Artificial intelligence is becoming more integral to the fight against climate change. AI can process huge amounts of climate data from satellites and generate new climate models which more accurately predict climate patterns and events.
This information is then used to inform climate policy and regulations.
Climate modeling is the most obvious way that AI can mitigate humanity’s environmental impact, but there are many other ways that researchers are utilizing artificial intelligence to make society greener. Some examples include using AI to:7,8
- Plan and optimize flight routes for air travel to reduce climate-impacting contrails.
- Design and develop lighter and more efficient materials.
- Study and optimize the lifespan of lithium batteries used to power electric devices and vehicles.
- Improve efficiency in production and industry.
- Identify, sort, and collect recyclable material.
- Predict and detect wildfires and other natural disasters sooner.
- Optimize water usage and conservation.
With society’s increasing reliance on artificial intelligence, the modern world is moving away from manual mechanisms and towards automation. In many ways, AI makes for more efficient and functional processing.
However, there are also some unseen costs associated with artificial intelligence, such as the environmental impact of training AI.
By understanding how AI works and the real power needed to make it operable, the carbon footprint of AI can be measured, which in turn allows it to be mitigated using offset strategies that remove carbon naturally.
Frequently Asked Questions About Carbon Footprint of AI
What Is a Carbon Emissions Calculator?
A carbon emissions calculator or carbon footprint calculator is a tool which can help individuals approximate their environmental impact by estimating the carbon emissions generated in different spheres of life, from travel, to work, to home. Carbon footprint calculation can provide information on carbon emissions for an individual, a company, or even a specific product.
How Do Individual Carbon Footprint Trackers Work?
Individual carbon footprint trackers generally work by combining certain pieces of information gathered from the individual (e.g., size and location of home, monthly energy bill, type of vehicle, number of pets, weekly miles traveled) with average emissions in these areas to generate carbon footprint estimates. For the most part, they eliminate the need to learn how to calculate a carbon footprint manually.
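The estimation logic such trackers use can be sketched in a few lines. The emission factors below are illustrative assumptions, not official EPA values:

```python
# Toy footprint tracker: multiply each reported activity by an average
# emission factor and sum. All factors are assumed, illustrative values.
FACTORS_KG_CO2E = {
    "electricity_kwh": 0.4,      # per kWh of grid electricity
    "car_miles": 0.35,           # per mile in an average gasoline car
    "natural_gas_therms": 5.3,   # per therm of natural gas
}

monthly_usage = {"electricity_kwh": 900, "car_miles": 800, "natural_gas_therms": 40}

monthly_kg = sum(FACTORS_KG_CO2E[k] * v for k, v in monthly_usage.items())
print(f"Estimated monthly footprint: {monthly_kg:.0f} kg CO2e")
```

Real calculators refine these factors by region and fuel mix, but the multiply-and-sum structure is the same.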
References
1Al, K. (2023, February 18). The AI Search Wars: Measuring the CO2 Emissions of Bing’s ChatGPT and Google’s AI Bard. LinkedIn. Retrieved August 28, 2023, from <https://www.linkedin.com/pulse/ai-search-wars-measuring-co2-emissions-bings-chatgpt-googles-al->
2United States Department of State. (2023). Artificial Intelligence (AI). State Department. Retrieved August 24, 2023, from <https://www.state.gov/artificial-intelligence/>
3Grand View Research, Inc. (2023, July 3). Artificial Intelligence Market to Hit $1811.75 Billion by 2030: Grand View Research, Inc. Bloomberg.com. Retrieved August 27, 2023, from <https://www.bloomberg.com/press-releases/2023-07-03/artificial-intelligence-market-to-hit-1-811-75-billion-by-2030-grand-view-research-inc>
4Biswal, A. (2023, August 21). Top 18 Artificial Intelligence (AI) Applications in 2023. Simplilearn.com. Retrieved August 24, 2023, from <https://www.simplilearn.com/tutorials/artificial-intelligence-tutorial/artificial-intelligence-applications>
5Brandl, R., & Ellis, C. (2023, July 19). ChatGPT Statistics and User Numbers 2023 – OpenAI Chatbot. Tooltester. Retrieved August 24, 2023, from <https://www.tooltester.com/en/blog/chatgpt-statistics/>
6Burns, E. (2023, July). What is Artificial Intelligence and How Does AI Work. TechTarget. Retrieved August 24, 2023, from <https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence>
7CBS News. (2023, August 27). AI has a giant carbon footprint. Can the technology also fight climate change? WTOP. Retrieved August 27, 2023, from <https://wtop.com/tech/2023/08/ai-has-a-giant-carbon-footprint-can-the-technology-also-fight-climate-change/>
8Cho, R. (2023, June 9). AI’s Growing Carbon Footprint. State of the Planet. Retrieved August 24, 2023, from <https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/>
9Cooper, C. (2023, May 19). Artificial Intelligence and Carbon Footprint — is AI destroying our planet? LinkedIn. Retrieved August 28, 2023, from <https://www.linkedin.com/pulse/artificial-intelligence-carbon-footprint-ai-our-planet-colin-cooper>
10Edwards, N. (2023, August 22). GPT-4. Wikipedia. Retrieved August 24, 2023, from <https://en.wikipedia.org/wiki/GPT-4>
11Microsoft. (2022, May 24). Emit less carbon from AI. Microsoft. Retrieved August 27, 2023, from <https://www.microsoft.com/en-us/research/project/reducing-ais-carbon-footprint/articles/emit-less-carbon-from-ai/>
12Great Learning. (2023, June 16). Generative AI Models. Great Learning. Retrieved August 28, 2023, from <https://www.mygreatlearning.com/blog/generative-ai-models/>
13Gülen, K. (2023, April 4). Basic Components Of Artificial Intelligence. Dataconomy. Retrieved August 23, 2023, from <https://dataconomy.com/2023/04/03/basic-components-of-artificial-intelligence/>
14Hiter, S., & Maguire, J. (2023, June 26). What Is a Generative AI Model? – Artificial Intelligence. eWEEK. Retrieved August 28, 2023, from <https://www.eweek.com/artificial-intelligence/generative-ai-model/>
15ITU News. (2022, September 23). How to reduce the carbon footprint of advanced AI models. ITU. Retrieved August 28, 2023, from <https://www.itu.int/hub/2022/09/how-to-reduce-the-carbon-footprint-of-advanced-ai-models/>
16OpenAI. (2022, November 30). Introducing ChatGPT. OpenAI. Retrieved August 27, 2023, from <https://openai.com/blog/chatgpt>
17Ludvigsen, A. (2023, July 18). The carbon footprint of GPT-4. Towards Data Science. Retrieved August 28, 2023, from <https://towardsdatascience.com/the-carbon-footprint-of-gpt-4-d6c676eb21ae>
18Marr, B. (2023, July 24). The Difference Between Generative AI And Traditional AI: An Easy Explanation For Anyone. Forbes. Retrieved August 25, 2023, from <https://www.forbes.com/sites/bernardmarr/2023/07/24/the-difference-between-generative-ai-and-traditional-ai-an-easy-explanation-for-anyone/>
19Marr, B., & Danise, A. (2019, November 11). 13 Mind-Blowing Things Artificial Intelligence Can Already Do Today. Forbes. Retrieved August 25, 2023, from <https://www.forbes.com/sites/bernardmarr/2019/11/11/13-mind-blowing-things-artificial-intelligence-can-already-do-today/>
20McQuate, S. (2023, July 27). Q&A: UW researcher discusses just how much energy ChatGPT uses. University of Washington. Retrieved August 24, 2023, from <https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use/>
21Our World in Data. (2023). Parameters in notable artificial intelligence systems. Our World in Data. Retrieved August 28, 2023, from <https://ourworldindata.org/grapher/artificial-intelligence-parameter-count>
22Patterson, D. (2022, February 15). Good News About the Carbon Footprint of Machine Learning Training. Google AI Blog. Retrieved August 24, 2023, from <https://ai.googleblog.com/2022/02/good-news-about-carbon-footprint-of.html>
23Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., & Dean, J. (2023). Carbon Emissions and Large Neural Network Training. Cornell University – arXiv. Retrieved August 24, 2023, from <https://arxiv.org/ftp/arxiv/papers/2104/2104.10350.pdf>
24Pocock, K. (2023, August 11). How much did GPT-3 cost? PC Guide. Retrieved August 25, 2023, from <https://www.pcguide.com/apps/gpt-3-cost/>
25Thompson, N., Greenewald, K., Lee, K., & Manso, G. (2021, October). Deep Learning’s Diminishing Returns: The Cost of Improvement is Becoming Unsustainable. IEEE Spectrum, 58(10), 50-55. 10.1109/MSPEC.2021.9563954. Retrieved August 23, 2023, from <https://ieeexplore.ieee.org/document/9563954>
26IBM. (2023). What is Artificial Intelligence (AI)? IBM. Retrieved August 23, 2023, from <https://www.ibm.com/topics/artificial-intelligence>
27Williamson, K. (2023, July 28). Website Carbon Footprint Calculator: Carbon Footprint of the Internet (By Site). 8 Billion Trees. Retrieved August 22, 2023, from <https://8billiontrees.com/carbon-offsets-credits/website-carbon-footprint/>
28Wilson, C. (2023, April 4). How AI Works: The Basics You Need to Know. HubSpot Blog. Retrieved August 24, 2023, from <https://blog.hubspot.com/marketing/how-does-ai-work>
29TRG Datacenters. (2023, July 26). AI Chatbots: Energy usage of 2023’s most popular chatbots (so far). TRG Datacenters. Retrieved August 28, 2023, from <https://www.trgdatacenters.com/ai-chatbots-energy-usage-of-2023s-most-popular-chatbots-so-far/>
30Ludvigsen, A. (2023, March 3). Facebook disclose the carbon footprint of their new LLaMA models. Kasper Groes Albin Ludvigsen. Retrieved August 28, 2023, from <https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b>
31Cho, R. (2023, June 9). AI’s Growing Carbon Footprint. Columbia Climate School. Retrieved August 30, 2023, from <https://news.climate.columbia.edu/2023/06/09/ais-growing-carbon-footprint/>
32Congress.gov. (2020, March 12). H.R.6216 – National Artificial Intelligence Initiative Act of 2020. Congress.gov. Retrieved August 30, 2023, from <https://www.congress.gov/bill/116th-congress/house-bill/6216>
33National Artificial Intelligence Initiative Office. (2023). The National Artificial Intelligence Initiative Office (NAIIO). National Artificial Intelligence Initiative Office. Retrieved August 30, 2023, from <https://www.ai.gov/naiio/>
34United States Environmental Protection Agency. (2023, May 18). Self-driving Vehicles. United States Environmental Protection Agency. Retrieved August 30, 2023, from <https://www.epa.gov/greenvehicles/self-driving-vehicles>
35Photos by SuttleMedia, TheDigitalArtist, geralt, Pixaline, tungnguyen0905, and Willi-van-de-Winkel. Pixabay. Retrieved from <https://pixabay.com>