Determining the precise energy usage for creating a single Balenciaga pope isn’t a straightforward calculation. However, we do have some insight into the actual energy expenditure of AI.
It's common knowledge that machine learning consumes a lot of energy. All those AI models powering email summaries, regicidal chatbots, and videos of Homer Simpson singing nu-metal are racking up a hefty server bill measured in megawatt-hours. But no one, it seems, not even the companies behind the tech, can say exactly what the cost is.
What is the energy consumption of AI?
Estimates do exist, but experts say those figures are partial and contingent, offering only a glimpse of AI's total energy usage. This is because machine learning models are incredibly variable, able to be configured in ways that dramatically alter their power consumption.
Moreover, the organizations best placed to produce a bill, companies like Meta, Microsoft, and OpenAI, simply aren't sharing the relevant information. (Judy Priest, CTO for cloud operations and innovations at Microsoft, said in an email that the company is currently "investing in developing methodologies to quantify the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application." OpenAI and Meta did not respond to requests for comment.)
One important factor we can identify is the difference between training a model for the first time and deploying it to users. Training, in particular, is extremely energy intensive, consuming much more electricity than traditional data center activities. Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt-hours (MWh) of electricity, about as much as 130 US homes consume in a year.
To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you'd have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3. But it's difficult to say how a figure like this applies to current state-of-the-art systems. The energy consumption could be higher, because AI models have been steadily trending upward in size for years, and bigger models require more energy.
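The comparisons above are simple ratios and can be checked directly. The sketch below uses the figures from the article; the roughly 10 MWh annual consumption of a US household is an assumption inferred from the 130-homes comparison, not a number stated in the text.

```python
# Back-of-envelope check of the article's energy comparisons.
# Figures are from the text; the per-home annual figure is an assumption.
gpt3_training_mwh = 1_300      # estimated electricity to train GPT-3
home_annual_mwh = 10           # assumed annual use of one US household
netflix_hour_mwh = 0.0008      # ~0.8 kWh per streamed hour of Netflix

homes_per_year = gpt3_training_mwh / home_annual_mwh
netflix_hours = gpt3_training_mwh / netflix_hour_mwh

print(f"{homes_per_year:.0f} US homes powered for a year")
print(f"{netflix_hours:,.0f} hours of Netflix streaming")
```

Both printed values match the article's figures: 130 homes and about 1,625,000 streaming hours.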
On the other hand, companies might be using some of the proven methods to make these systems more energy efficient, which would dampen the upward trend of energy costs. The challenge of making up-to-date estimates, says Sasha Luccioni, a researcher at French-American AI firm Hugging Face, is that companies have become more secretive as AI has become profitable.
Go back just a few years and firms like OpenAI would publish details of their training regimes: what hardware they used and for how long. But the same information simply doesn't exist for the latest models, like ChatGPT and GPT-4, says Luccioni. "With ChatGPT we don't know how big it is, we don't know how many parameters the underlying model has, we don't know where it's running … It could be three raccoons in a trench coat because you just don't know what's under the hood."