this post was submitted on 19 Dec 2025
366 points (98.2% liked)

Fuck AI

5004 readers
787 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago

In a study published on Monday in the peer-reviewed journal Patterns, data scientist Alex de Vries-Gao estimated the carbon emissions from electricity used by AI at between 33 million and 80 million metric tons.

That higher figure would put it above last year's totals for Chile (78m tons), Czechia (78m tons), Romania (71m tons), and New York City (48m tons, including both CO2 and other greenhouse gases).

[–] brucethemoose@lemmy.world -1 points 1 week ago* (last edited 1 week ago)
  • For sane models, that’s way overstated. Stuff like GLM 4.6 or Kimi K2 was trained on peanuts, and their inference GPU time dwarfs the training cost anyway.

  • I have not checked the latest OpenAI/Grok training cost claims. But if any company is spending tens of millions (or hundreds?) on a single training run… that’s just stupid. It means they’re burning GPUs ridiculously inefficiently, for the sake of keeping up appearances. Llama 4 rather definitively proved that scaling up doesn’t work.

The hype about ever-increasing training costs is a grift to get people to give Sam Altman money. He doesn’t need that for the architectures they’re using, and it won’t be long before everyone figures it out and switches to cheaper models for most usage.