In 1903, Mark Twain wrote that “It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a photograph, or a telephone or any other important thing.” This observation still mostly holds true. The invention of artificial intelligence required decades of work by thousands of scientists, engineers, and industry leaders. It will require many more men and women to develop the technology in the years ahead.
As the march of AI accelerates, a new requirement has become apparent: the next breakthroughs will consume colossal quantities of energy. AI guzzles electricity — a single ChatGPT query requires 10 times as much as a conventional web search. As AI usage increases, its energy requirements will rise, and if demand outstrips supply, the technology’s development will be strangled.
The data centers that underpin AI development at scale, powering GPT-4, Gemini, and other frontier models, need around-the-clock access to power. They already account for roughly 3% of annual U.S. electricity consumption, and that share is expected to more than double over the next five to 10 years. More broadly, AI's electricity usage is projected to grow from 4 TWh in 2023 to 93 TWh in 2030, more than Washington State consumed in 2022. And that is a conservative estimate; AI could be using that much electricity as early as 2025.