A new report reveals that ChatGPT consumes 17,000 times more electricity per day than the average US household

Will there be enough electricity in the foreseeable future to power AI advances with its exponential growth and rapid adoption worldwide?


What you need to know

A new report indicates that ChatGPT consumes up to 17,000 times more electricity than the average US household uses daily (via The New Yorker). Elon Musk recently indicated that we're on the verge of the biggest technology revolution with AI, but that there won't be enough electricity by 2025 to power these advancements.

Putting this into perspective, The New Yorker indicates that an average US household consumes up to 29 kilowatt-hours daily. This means ChatGPT burns through roughly 493,000 kilowatt-hours every day. With the exponential growth, advances, and rapid adoption of generative AI worldwide, things could go from bad to worse.
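The 493,000 figure follows directly from the two numbers The New Yorker cites, as a quick sanity check shows:

```python
# Sanity check of the figures quoted above. Both inputs come
# straight from the article: 29 kWh/day per household, and a
# 17,000x multiplier for ChatGPT.
household_kwh_per_day = 29
multiplier = 17_000

chatgpt_kwh_per_day = household_kwh_per_day * multiplier
print(chatgpt_kwh_per_day)  # 493000 kWh per day
```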

According to a research paper published by Dutch National Bank data scientist Alex de Vries, if Google were to integrate AI into every search, the technology's electricity consumption would surge to approximately 29 billion kilowatt-hours per year. The data scientist indicates that this is more electricity than many countries consume annually, including Kenya, Guatemala, and Croatia.

While speaking to Business Insider, de Vries indicated:

“AI is just very energy intensive. Every single of these AI servers can already consume as much power as more than a dozen UK households combined. So the numbers add up really quickly.”

Per de Vries' estimates (if all factors are held constant), by 2027 the AI sector could be consuming anywhere from 85 to 134 terawatt-hours annually. While speaking to The Verge, the data scientist added:

“You’re talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027. I think that’s a pretty significant number.”
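The "half a percent" figure checks out against the upper end of de Vries' range. The global consumption figure below (~25,000 TWh per year) is an assumption for illustration; only the 85–134 TWh range comes from the article:

```python
# Rough sketch of de Vries' "half a percent" estimate.
# ai_twh_low/high come from the article; global_twh is an
# assumed ballpark for annual global electricity consumption.
ai_twh_low, ai_twh_high = 85, 134
global_twh = 25_000  # assumption, not from the article

share_high = ai_twh_high / global_twh
print(f"{share_high:.2%}")  # roughly half a percent
```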

Concerns are well beyond power consumption

ChatGPT is arguably one of the best AI-powered chatbots right now. A recent report disclosed that the AI assistant dominates the mobile market share, even after Microsoft shipped Copilot AI to iOS and Android users with free access to OpenAI's GPT-4 model and DALL-E 3 image generation technology.

The chatbot has achieved incredible feats, including developing software in under 7 minutes, generating free Windows keys, and bypassing paywalled information. While impressive, all of this comes at an exorbitant cost.

OpenAI spends up to $700,000 daily to keep ChatGPT running, which could explain reports earlier in the year indicating that the ChatGPT maker is on the verge of bankruptcy. This is amid reports that the chatbot is losing accuracy and is seemingly getting dumber.

Last year, new research showed that Microsoft Copilot and ChatGPT could consume enough electricity to power a small country for a year by 2027. This is on top of the high demand for cooling water, as the chatbot consumes the equivalent of a bottle of water for cooling per query.

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You’ll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.