this post was submitted on 19 Dec 2025
366 points (98.2% liked)

Fuck AI


In a study published on Monday by the peer-reviewed journal Patterns, data scientist Alex de Vries-Gao estimated the carbon emissions from electricity used by AI at between 33 million and 80 million metric tons.

That higher figure would put it above last year's totals for Chile (78m tons), Czechia (78m tons), Romania (71m tons), and New York City (48m tons, including both CO2 and other greenhouse gases).

[–] Blaster_M@lemmy.world -1 points 1 week ago* (last edited 1 week ago) (4 children)

Last time I saw numbers, the power requirements of 100,000 ChatGPT responses equate to the same server-side energy usage as one person watching 1 hour of Netflix.

A gaming PC typically has a GPU that pulls between 200-300W when running an AAA game, plus roughly 90W for the stressed CPU and another 100W for other system components. Add 45-60W for your monitor as well.

Gaming takes a lot of power.
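Summing the commenter's own per-component figures gives a rough total; a quick sketch (all numbers are the estimates quoted above, not measurements):

```python
# Rough total draw of a gaming PC from the per-component
# estimates quoted in the comment above.
gpu = (200, 300)      # W, GPU under AAA load
cpu = (90, 90)        # W, stressed CPU
other = (100, 100)    # W, other system components
monitor = (45, 60)    # W, display

low = gpu[0] + cpu[0] + other[0] + monitor[0]
high = gpu[1] + cpu[1] + other[1] + monitor[1]
print(f"Gaming PC total draw: {low}-{high} W")  # 435-550 W
```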

Running a local LLM on the same GPU, a response takes only about 10 seconds of GPU and supporting-hardware energy, versus the hours you spend running a game. Gaming is more environmentally damaging than AI is.

The reason these big scary numbers are here is because all that energy usage is collected in one spot. If we added up everyone's individual gaming habits, it might make datacenter energy usage look small in comparison.

The only real difference is that the datacenters use open liquid cooling instead of air or closed-loop cooling, the latter two being much more environmentally friendly.

[–] snooggums@piefed.world 16 points 1 week ago (1 children)

> Last time I saw numbers, the power requirements of 100,000 ChatGPT responses equate to the same server-side energy usage as one person watching 1 hour of Netflix.

They use the majority of the water during the training phase, but present only inference-time usage numbers for people to fall for, like you are doing right here.

That is like only counting the time spent by a delivery driver walking packages to a house and ignoring all of the time spent getting it to the delivery company, sorting it, driving it to the airport, flying it to another city, driving it to the distribution center, sorting it again, and then driving it to your house. Sure, if you only count the time delivery people spent walking to houses it isn't that much time at all!

[–] brucethemoose@lemmy.world -1 points 1 week ago* (last edited 1 week ago)
  • For sane models, that’s way overstated. Stuff like GLM 4.6 or Kimi K2 is trained on peanuts, and their inference GPU time blows it away.

  • I have not checked on the latest OpenAI/Grok training cost claims. But if any company is spending tens of millions (or hundreds?) on a single training run… that’s just stupid. It means they’re burning GPUs ridiculously inefficiently, for the sake of keeping up appearances. Llama 4 rather definitively proved that scaling up doesn’t work.

The hype about ever increasing training costs is a grift to get people to give Sam Altman money. He doesn’t need that for the architectures they’re using, and it won’t be long before everyone figures it out and switches to cheaper models for most usage.

[–] cron@feddit.org 5 points 1 week ago

Those numbers comparing ChatGPT with Netflix do not seem plausible to me.

Streaming a video is basically sending a file over the internet, while ChatGPT requires multiple GPUs to run.

I found different numbers online:

  • ChatGPT: 0.3 to 3 Wh per query
  • Netflix: 77 Wh per hour (source)

By this rough calculation, your estimate is off by roughly three orders of magnitude. It is more like 25-260 ChatGPT queries that equal an hour of Netflix, not 100,000.
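The division behind that claim, using only the two figures quoted above:

```python
# How many ChatGPT queries match one hour of Netflix,
# given the figures cited above: 0.3-3 Wh per query, 77 Wh per hour.
netflix_wh_per_hour = 77
query_wh_low, query_wh_high = 0.3, 3.0

queries_high = netflix_wh_per_hour / query_wh_low   # if queries are cheap
queries_low = netflix_wh_per_hour / query_wh_high   # if queries are expensive
print(f"{queries_low:.0f} to {queries_high:.0f} queries per Netflix-hour")
# roughly 26 to 257 queries, nowhere near 100,000
```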

[–] AmbiguousProps@lemmy.today 5 points 1 week ago

It's the training that consumes the electricity, not the individual queries.

[–] bridgeenjoyer@sh.itjust.works -5 points 1 week ago (1 children)

Exactly. I'd rather focus on the dumbing down of society and the destruction of the internet (not to mention fascism and mass surveillance) from AI, rather than nitpick environmental concerns no one cares about.

[–] Catoblepas@piefed.blahaj.zone 5 points 1 week ago (1 children)

‘Who cares that the house is on fire? My internet is crap!’ is exactly the dumbing down of society you’re talking about. You can’t use the internet if the planet doesn’t support human life.

[–] bridgeenjoyer@sh.itjust.works -1 points 1 week ago

Very true. I'm just saying there are a lot of other, much worse environmental concerns to go after. Harping on AI for this reason just gets us laughed at as old man yelling at cloud.