In a recent thematic investing report, analysts at Barclays have underscored the burgeoning energy demands set to accompany the rapid advancement of artificial intelligence (AI) technologies, with particular emphasis on NVIDIA’s pivotal role in this evolving landscape.
According to Barclays, the projected surge in energy requirements linked to AI progress is a critical aspect of NVIDIA’s market trajectory. Analysts foresee data centers consuming more than 9% of current U.S. electricity demand by 2030, largely driven by the escalating power needs of AI applications. This surge, which the analysts describe as “AI power baked into NVIDIA consensus,” is a key driver of the energy forecast.
Despite improvements in AI efficiency with each GPU generation, the complexity and size of AI models, such as large language models (LLMs), are expanding rapidly. These models, increasing approximately 3.5 times in size annually, demand substantial computational power.
Barclays highlights NVIDIA’s efforts in developing more energy-efficient GPUs like the Hopper and Blackwell series. However, the growing complexity of AI models necessitates increased computational resources, driving up overall energy consumption.
“Large language models require immense computational power for real-time performance,” notes the report, attributing higher energy consumption to the escalating demands on memory, accelerators, and servers necessary for training and deploying these models.
Barclays projects that powering around 8 million GPUs will require approximately 14.5 gigawatts of power, translating to about 110 terawatt-hours (TWh) of energy per year, assuming an 85% average load factor. With a significant portion of these GPUs expected to be deployed in the U.S. by 2027, U.S. power demand alone could exceed 10 gigawatts, or roughly 75 TWh of annual energy, within three years.
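The conversion behind these figures is straightforward: continuous power (GW) times hours in a year times the load factor gives annual energy (GWh), which divides by 1,000 into TWh. A minimal sketch of that arithmetic, using the report’s stated inputs (the function name and structure here are illustrative, not from the report):

```python
def annual_energy_twh(power_gw: float, load_factor: float = 0.85) -> float:
    """Convert a continuous power draw (GW) into annual energy (TWh),
    scaled by an average load factor."""
    hours_per_year = 8760  # 365 days * 24 hours
    return power_gw * hours_per_year * load_factor / 1000  # GWh -> TWh

# Global figure: ~14.5 GW at an 85% average load factor
print(annual_energy_twh(14.5))  # ~108 TWh, in line with the cited ~110 TWh

# U.S.-only figure: ~10 GW by 2027
print(annual_energy_twh(10.0))  # ~74 TWh, in line with the cited ~75 TWh
```

The small gap between the computed ~108 TWh and the quoted 110 TWh is consistent with the report rounding its headline numbers.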
“NVIDIA’s market cap suggests this is just the beginning of AI power demand deployment,” Barclays analysts predict. The ongoing development and deployment of GPUs by NVIDIA are poised to substantially escalate energy consumption across global data centers.
Furthermore, the reliance of data centers on grid electricity underscores the need to manage peak power demands effectively. Continuous operation mandates a stable and robust power supply infrastructure.
The report concludes by citing Sam Altman, CEO of OpenAI, who highlighted at the Davos World Economic Forum the underestimated energy requirements of AI technologies: “We do need way more energy in the world than I think we thought we needed before…I think we still don’t appreciate the energy needs of this technology.”