March 13, 2026

Artificial intelligence systems such as ChatGPT consume significantly different amounts of energy depending on the complexity of the task, according to a new investigation by MIT Technology Review. The report estimates that generating a single response from a large language model can require between 114 and 6,706 joules of energy.
The analysis highlights how different types of generative AI workloads vary widely in power consumption. Simpler models typically use fewer parameters and require less electricity but may produce less accurate results.
The gap becomes far larger when AI generates multimedia content. The report found that producing a five-second AI-generated video can require about 3.4 million joules of energy, more than 700 times the energy needed to generate a high-quality image.
Researchers also calculated the energy cost of typical AI usage. A scenario involving 15 chatbot prompts, 10 generated images and three short videos would consume roughly 2.9 kilowatt-hours of electricity — about the same amount of energy used by a microwave operating for more than three and a half hours.
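The report's figures are enough for a back-of-the-envelope check of that scenario. The sketch below assumes (these are illustrative assumptions, not figures stated by the report) that each chatbot response uses the upper-bound 6,706 joules and that one image uses roughly 1/700 of a five-second video's energy, following the ratio given above:

```python
# Back-of-the-envelope check of the reported 2.9 kWh usage scenario.
# Assumptions (illustrative, not from the report verbatim): each chatbot
# response uses the upper-bound 6,706 J, and one image uses 1/700 of a
# five-second video's energy.
J_PER_KWH = 3_600_000  # joules in one kilowatt-hour

video_j = 3_400_000     # ~3.4 million joules per five-second video
image_j = video_j / 700  # image energy inferred from the report's ratio
prompt_j = 6_706        # upper bound for one chatbot response

total_j = 15 * prompt_j + 10 * image_j + 3 * video_j
total_kwh = total_j / J_PER_KWH
print(round(total_kwh, 2))  # close to the reported ~2.9 kWh

# Implied average power if that energy ran a microwave for 3.5 hours:
avg_watts = total_j / (3.5 * 3600)
print(round(avg_watts))  # in the range of a typical microwave's rating
```

The result lands at roughly 2.9 kWh, dominated almost entirely by the three videos, and spreading that energy over three and a half hours implies a device drawing on the order of 800 watts, consistent with a household microwave.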
The report also examined the growing electricity demands of data centres that power modern AI systems. Historically, data centre energy consumption had remained relatively stable because improvements in hardware efficiency offset rising demand for cloud computing.
That trend has shifted with the rapid adoption of generative AI. Electricity consumption by U.S. data centres has roughly doubled since 2017, according to the report.
Government data cited in the analysis suggests AI workloads will continue driving that growth. By 2028, about half of all electricity used by data centres in the United States is expected to power artificial intelligence systems.
The findings arrive as technology companies expand AI features across consumer and enterprise products. Generative tools are increasingly integrated into search engines, productivity software and online services.
