The Dark Side of AI: GPT-5’s Energy Demands Exposed

OpenAI’s latest model, GPT-5, is a testament to the rapid advances in artificial intelligence: it can build websites and answer complex scientific questions. Its release, however, has also exposed a significant and often overlooked issue: massive energy consumption. In the absence of official data from the company, independent researchers are measuring the model’s environmental footprint and reporting a dramatic increase in power usage that raises alarms about the future of sustainable technology.
The numbers are a wake-up call. Researchers at the University of Rhode Island’s AI lab have found that a medium-length response from GPT-5 consumes an average of 18 watt-hours, a substantial increase over previous models and “significantly more energy than GPT-4o,” according to a researcher in the group. To put this in perspective, 18 watt-hours is enough to run a standard 60-watt incandescent bulb for 18 minutes. And because ChatGPT handles billions of requests daily, the total could approach the daily electricity demand of 1.5 million US homes, a staggering figure that underscores the scale of the problem.
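
To see how those headline numbers fit together, here is a minimal back-of-envelope sketch. The 18 Wh figure comes from the University of Rhode Island estimate above; the daily query volume and the average household consumption are assumptions chosen to illustrate the arithmetic, not reported data.

# Back-of-envelope check of the figures above. All inputs are assumptions
# or third-party estimates, not OpenAI data.
WH_PER_RESPONSE = 18       # URI estimate: watt-hours per medium-length GPT-5 reply
QUERIES_PER_DAY = 2.5e9    # assumed daily request volume ("billions of requests")
HOME_KWH_PER_DAY = 30      # assumed average US household use, ~30 kWh per day

total_kwh = WH_PER_RESPONSE * QUERIES_PER_DAY / 1000   # convert Wh to kWh
homes = total_kwh / HOME_KWH_PER_DAY

print(f"Total: {total_kwh / 1e6:.0f} GWh per day")                    # ~45 GWh/day
print(f"Roughly {homes / 1e6:.1f} million US homes' daily demand")    # ~1.5 million

Under these assumptions the total lands near 45 gigawatt-hours per day, which is how a per-response figure of 18 Wh translates into the household comparison quoted above.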
This surge in power consumption is directly tied to the model’s size. Although OpenAI has not released a parameter count for GPT-5, experts widely believe it to be “several times larger than GPT-4.” That matters because of a study from the French AI company Mistral, which found a “strong correlation” between a model’s size and its resource consumption: a model ten times bigger has an impact roughly one order of magnitude larger, meaning resource use grows roughly in proportion to parameter count. It suggests that the trend of building ever-larger AI models, a stated goal of companies like OpenAI, will continue to drive up resource usage at an alarming rate.
GPT-5’s new capabilities also play a significant role in its energy demands. Its “reasoning mode” and its ability to process video and images require far more computation than simple text generation. A professor studying the resource footprint of AI models noted that engaging the reasoning mode could increase resource usage by a factor of “five to 10.” So while a “mixture-of-experts” architecture, which activates only a fraction of the model’s parameters for any given query, claws back some efficiency, the newer and more complex tasks are pushing the overall energy footprint to new heights. This raises serious questions about the long-term sustainability of the AI industry and the need for greater transparency.
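
A rough sketch of what that multiplier means per query, assuming the 18 Wh baseline above and treating the quoted “five to 10” range as a simple scaling factor (the helper function is hypothetical, not anything OpenAI has published):

def estimated_wh(base_wh=18.0, reasoning_factor=1.0):
    # Hypothetical per-response energy estimate in watt-hours:
    # an assumed base figure scaled by an assumed reasoning-mode multiplier.
    return base_wh * reasoning_factor

print(estimated_wh())                       # plain response: 18 Wh
print(estimated_wh(reasoning_factor=5))     # low end of the quoted range: 90 Wh
print(estimated_wh(reasoning_factor=10))    # high end of the quoted range: 180 Wh

At the high end of that range, a single reasoning-mode response would consume roughly as much electricity as running that same 60-watt bulb for three hours.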
