Environment & Energy
Advanced AI models generate up to 50 times more CO₂ emissions than more common LLMs when answering the same questions
By Ben Turner published 2 days ago
Asking AI reasoning models questions in areas such as algebra or philosophy caused carbon dioxide emissions to spike significantly.
The more accurate we try to make AI models, the bigger their carbon footprint, with some prompts producing up to 50 times more carbon dioxide emissions than others, a new study has revealed.
Reasoning models, such as Anthropic's Claude, OpenAI's o3 and DeepSeek's R1, are specialized large language models (LLMs) that dedicate more time and computing power to produce more accurate responses than their predecessors.
Yet, aside from some impressive results, these models have been shown to face severe limitations in their ability to crack complex problems. Now, a team of researchers has highlighted another constraint on the models' performance: their exorbitant carbon footprint. They published their findings June 19 in the journal Frontiers in Communication.
"The environmental impact of questioning trained LLMs is strongly determined by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions," study first author Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences in Germany, said in a statement. "We found that reasoning-enabled models produced up to 50 times more CO₂ emissions than concise response models."
Snip...
https://www.livescience.com/technology/artificial-intelligence/advanced-ai-reasoning-models-o3-r1-generate-up-to-50-times-more-co2-emissions-than-more-common-llms

OKIsItJustMe
(21,458 posts)
Where is the comparison to conventional search results, which employ no AI whatsoever?
highplainsdem
(57,433 posts)
There were already reports last year that AI search used roughly 10 times the electricity of earlier search. July of last year:
https://www.npr.org/2024/07/12/g-s1-9545/ai-brings-soaring-emissions-for-google-and-microsoft-a-major-contributor-to-climate-change
Assuming that was correct, the newer AI models - which also hallucinate more - could be using hundreds of times more electricity than conventional, pre-genAI search.
And the AI companies have made it clear they're desperate for more data centers and more power for them. Musk is violating the law to power his Memphis data center. Article published last month:
https://www.nbcnews.com/news/us-news/naacp-memphis-musk-xai-colossus-rcna208589
OKIsItJustMe
(21,458 posts)
How Much Energy Does AI Use? The People Who Know Aren't Saying
A growing body of research attempts to put a number on energy use and AI, even as the companies behind the most popular models keep their carbon emissions a secret.
As a result of this lack of transparency, Luccioni says, the public is being exposed to "estimates that make no sense but which are taken as gospel." You may have heard, for instance, that the average ChatGPT request takes 10 times as much energy as the average Google search. Luccioni and her colleagues tracked this claim down to a public remark that John Hennessy, the chairman of Alphabet, the parent company of Google, made in 2023.
A claim made by a board member from one company (Google) about the product of another company to which he has no relation (OpenAI) is tenuous at best. Yet, Luccioni's analysis finds, this figure has been repeated again and again in press and policy reports. (As I was writing this piece, I got a pitch with this exact statistic.)
"People have taken an off-the-cuff remark and turned it into an actual statistic that's informing policy and the way people look at these things," Luccioni says. "The real core issue is that we have no numbers. So even the back-of-the-napkin calculations that people can find, they tend to take them as the gold standard, but that's not the case."
Perhaps most crucially for our understanding of AI's emissions, open source models like the ones Dauner used in his study represent a fraction of the AI models used by consumers today. Training a model and updating deployed models takes a massive amount of energy: figures that many big companies keep secret. It's unclear, for example, whether the light bulb statistic about ChatGPT from OpenAI's Altman takes into account all the energy used to train the models powering the chatbot. Without more disclosure, the public is simply missing much of the information needed to start understanding just how much this technology is impacting the planet.
highplainsdem
(57,433 posts)
This tech is very bad for the environment.
And I would never trust people like Elon Musk or Sam Altman to give honest answers.
Or anyone in the Trump admin, for that matter.
SheltieLover
(71,884 posts)