this post was submitted on 22 Feb 2026
521 points (99.6% liked)

Not The Onion

20526 readers
1919 users here now

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 2 years ago
MODERATORS
 

“But it also takes a lot of energy to train a human,” Altman said. “It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.”

So in his view, the fair comparison is, “If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way.”

[–] MightBeAlpharius@lemmy.world 1 point 5 hours ago

Okay, so I'm not a big AI guy. It kind of sucks at everything we try to do with it, and it's basically a huge waste of resources right now.

But... Sometimes it's fun to play devil's advocate.

AI consumes shitloads of electricity and water, and produces nothing but slop. Even if they're not using evaporative cooling, that water use impacts the availability of usable water downstream of the data center. Also, it's a huge money pit - last I saw, AI companies weren't really turning a profit.

The article addresses electricity (Altman specifically called out a pivot to nuclear, wind, and solar), but doesn't say a ton about the other issues... Which could all be addressed with coastal data centers.

Don't worry - I'm not about to suggest heating up the ocean to cool data centers. Instead, why not pivot back to evaporative cooling, but with seawater?

Build the data center, and put some cooling pools around it - twelve seems like a good number. Make the pools big enough that the center can be cooled without the use of all of the pools (this is important). Heat sinks are made of metal, and saltwater is bad for most metals, so slap on a few sacrificial anodes like they're metal-hulled boats. Boom - the data center is now cooled using non-potable water without warming the ocean.
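For scale, here's a quick back-of-envelope sketch of how much seawater a setup like that would actually boil off. The numbers are mine, not from the article or the comment: a hypothetical 100 MW heat load rejected entirely by evaporation, and water's latent heat of vaporization of roughly 2.26 MJ/kg.

```python
# Back-of-envelope: seawater evaporated by a purely evaporative cooling
# system. Assumed figures (hypothetical, not from the thread):
# a 100 MW heat load, latent heat of vaporization ~2.26 MJ per kg.

LATENT_HEAT_J_PER_KG = 2.26e6  # energy carried away per kg of water evaporated


def evaporation_rate_kg_per_s(heat_load_w: float) -> float:
    """Water that must evaporate each second to reject heat_load_w watts."""
    return heat_load_w / LATENT_HEAT_J_PER_KG


rate = evaporation_rate_kg_per_s(100e6)  # hypothetical 100 MW facility
print(f"~{rate:.0f} kg of seawater evaporated per second")
```

That works out to tens of kilograms per second, so the pools would need serious makeup-water plumbing - which is fine, since the ocean is right there.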

Now, as water evaporates, salt deposits will form in the cooling pools. When a pool gets too salty, it can be drained (or allowed to fully evaporate), and the salt can be knocked off and collected. Boom - losses reduced, data center is now a salt farm. Salt's not really worth much, but it could probably be marked up and sold to tech bros as fancy "AI powered sea salt."
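And the salt farm isn't even a small side hustle. Rough sketch below, with assumed numbers (mine, not the comment's): typical seawater salinity of ~35 g/kg, the same hypothetical 100 MW heat load, and all of it rejected by evaporation.

```python
# Rough salt yield from the evaporation pools. Assumed numbers
# (hypothetical, not from the thread): seawater salinity ~35 g per kg,
# a 100 MW heat load, latent heat of vaporization ~2.26 MJ per kg.

SALINITY = 0.035          # kg of salt per kg of seawater
LATENT_HEAT = 2.26e6      # J per kg of water evaporated
SECONDS_PER_DAY = 86_400


def salt_tonnes_per_day(heat_load_w: float) -> float:
    """Salt left behind in the pools per day once the water boils off."""
    evap_kg_per_s = heat_load_w / LATENT_HEAT
    return evap_kg_per_s * SALINITY * SECONDS_PER_DAY / 1000


print(f"~{salt_tonnes_per_day(100e6):.0f} tonnes of salt per day")
```

Over a hundred tonnes a day, under these assumptions - so the "AI powered sea salt" branding would need to move some serious volume.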

And then, once we've done that, we can train the AI to do something useful, like... Uh... Clean its own salt pools with a little robot, I guess; it kind of sucks at everything important.