this post was submitted on 26 Feb 2026
398 points (99.0% liked)

Technology

[–] fruitycoder@sh.itjust.works 1 points 7 hours ago (1 children)

GPGPU is probably going to see some real usage. There was an interesting talk at the X.Org conference about turning the video hardware into virtual services running on GPGPU-focused hardware.

I've talked with some of the HPC programmers too, who are already trying to find creative repurposes lol

[–] tal@lemmy.today 1 points 3 hours ago* (last edited 3 hours ago)

I think it's fair to say that AI isn't the only application for that hardware, but I also think carpelbridgesyndrome's point was that these machines aren't really well suited to replacing conventional servers, which is the scenario ouRKaoS was worried about, where all local computing moves to a server. Maybe they'd work for some very specialized use cases, like cloud gaming in some genres. I'd also add that the physical buildings have far more cooling capacity than conventional servers need, so they probably wouldn't be the most cost-effective approach even if you swapped out the computing hardware in the buildings.