Google’s AI alone might be using more energy than all of Ireland

WHY THIS MATTERS IN BRIEF

Training and running giant AI models like GPT-4 and Bard use huge amounts of energy, and it’s all starting to add up.

Artificial Intelligence (AI) powered systems not only consume huge amounts of data for training purposes but also require tremendous amounts of electricity to run, according to a new study that calculated the energy use and carbon footprint of several recent large language models.

One of them, GPT-3, the model behind ChatGPT, was trained on 10,000 NVIDIA GPUs and consumed an estimated 1,287 megawatt-hours of electricity in the process – the equivalent of the energy used by 121 homes in the United States for a year.
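
That household comparison is easy to sanity check. Here is a minimal sketch, assuming an average US home uses roughly 10.6 MWh of electricity per year (an approximate figure based on published EIA averages, not stated in the article):

```python
# Sanity check: how many average US homes does GPT-3's training energy cover for a year?
# Assumption: ~10.6 MWh of electricity per US household per year (approximate EIA average).

training_energy_mwh = 1_287    # estimated energy used to train GPT-3, from the article
household_mwh_per_year = 10.6  # assumed annual electricity use of an average US home

homes_for_a_year = training_energy_mwh / household_mwh_per_year
print(f"Roughly {homes_for_a_year:.0f} homes powered for a year")  # ~121 homes
```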

As we accelerate towards building one of the greatest technological developments humanity has ever achieved, we need to ask ourselves: what is the cost of this development? In a commentary published in the journal Joule, author Alex de Vries argues that in the future the energy demands of AI tools may exceed the power demands of some small nations.

There has also recently been a shift towards more companies developing their own chips to meet the heavy compute demands of AI. Google and Amazon already have their own AI chips, while rumors are rife that Microsoft will unveil its in-house chip hardware next month. Microsoft also has heavy investments in OpenAI which, according to reports, is in the early stages of either developing its own chips or acquiring a semiconductor company to do it for them.

All of this means that there will be a significant rise in the energy footprint of the AI industry – at least until new analogue chips and chip designs that are up to 100x more energy efficient are commercialised.

De Vries explains, “For example, companies such as Alphabet’s Google could substantially increase their power demand if generative AI is integrated into every Google search.”

According to the leading semiconductor and AI blog SemiAnalysis, integrating a ChatGPT-like chatbot into every Google search would require an estimated 512,820 of NVIDIA’s A100 HGX servers, which equates to more than 4 million GPUs. At a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh.
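
Those headline figures follow directly from the per-server numbers quoted above. Here is a minimal back-of-the-envelope sketch, assuming 8 GPUs per HGX A100 server (consistent with the “over 4 million GPUs” figure) and round-the-clock operation at the quoted 6.5 kW per server:

```python
# Back-of-the-envelope check of the SemiAnalysis-derived figures quoted above.
# Assumptions: 8 GPUs per NVIDIA HGX A100 server, 6.5 kW per server, 24/7 operation.

servers = 512_820           # HGX A100 servers for "generative AI in every Google search"
gpus_per_server = 8         # assumed GPUs per HGX A100 server
power_per_server_kw = 6.5   # quoted power demand per server

total_gpus = servers * gpus_per_server                 # ~4.1 million GPUs
fleet_power_gw = servers * power_per_server_kw / 1e6   # ~3.3 GW of continuous draw
daily_gwh = fleet_power_gw * 24                        # ~80 GWh per day
annual_twh = daily_gwh * 365 / 1_000                   # ~29.2 TWh per year

print(f"GPUs: {total_gpus:,}")
print(f"Daily consumption: {daily_gwh:.0f} GWh")
print(f"Annual consumption: {annual_twh:.1f} TWh")
```

It is that annual figure, in the tens of terawatt-hours, that puts the worst-case scenario in the same league as the yearly electricity consumption of an entire country like Ireland, which is the comparison in this article’s headline.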

 

The author noted that AI tools have an initial training phase followed by an inference phase. The training phase is the most energy-intensive and has been the focus of AI sustainability research to date. The inference phase is when these tools generate output based on the data they were trained on, and it is this phase that the author has called on the scientific community to pay more attention to.

“…OpenAI required 3,617 of NVIDIA’s HGX A100 servers, with a total of 28,936 GPUs, to support ChatGPT, implying an energy demand of 564 MWh per day,” said De Vries. And this was just to get the chatbot started, before any consumer had even begun using it. “Compared to the estimated 1,287 MWh used in GPT-3’s training phase, the inference phase’s energy demand appears considerably higher,” he added.
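
Taking those numbers at face value, a similar sketch shows how quickly the inference phase overtakes the one-off training cost. It assumes the same 6.5 kW per HGX A100 server as the Google search estimate above, since the quote itself does not give a per-server power figure:

```python
# Rough comparison of ChatGPT's daily inference demand with GPT-3's one-off training cost.
# Assumption: the same 6.5 kW per HGX A100 server as in the Google search estimate above.

servers = 3_617             # HGX A100 servers reportedly needed to support ChatGPT
power_per_server_kw = 6.5   # assumed power demand per server
training_mwh = 1_287        # estimated energy used in GPT-3's training phase

daily_inference_mwh = servers * power_per_server_kw * 24 / 1_000  # ~564 MWh per day
days_to_match_training = training_mwh / daily_inference_mwh       # ~2.3 days

print(f"Daily inference demand: ~{daily_inference_mwh:.0f} MWh")
print(f"Inference matches the full training cost every ~{days_to_match_training:.1f} days")
```

In other words, on these estimates the running chatbot consumes the equivalent of its entire training energy budget roughly every two to three days, which is why the author argues the inference phase deserves far more scrutiny.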

 

Lastly, the author noted that it is too optimistic to expect improvements in hardware and software efficiency to fully offset the long-term growth in AI-related electricity consumption. But efforts are being made.
