this post was submitted on 24 Feb 2026
341 points (98.3% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.


I stole this fair and square. Hope this hasn't been posted yet.

[–] WolfLink@sh.itjust.works 10 points 3 days ago* (last edited 3 days ago) (1 children)

Running an AI model on a GPU requires enough VRAM to hold the model; otherwise it falls back to the CPU, which is very slow. Mac Minis share RAM between the CPU and GPU (unified memory), and you can get a Mac mini with a lot of shared RAM much cheaper than a GPU with that much VRAM.
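To make the "does it fit" question concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not from the comment): the dominant memory cost is the weights themselves, roughly parameter count times bytes per parameter, and that total has to fit in VRAM or in a Mac's unified memory.

```python
def model_memory_gb(num_params_billion: float, bytes_per_param: float) -> float:
    """Memory needed for the weights alone, in GiB.

    Activations and the KV cache add more on top, so treat this as a floor.
    bytes_per_param: 2 for fp16/bf16, 1 for 8-bit quantization, 0.5 for 4-bit.
    """
    return num_params_billion * 1e9 * bytes_per_param / 2**30

# A hypothetical 70B-parameter model at 16-bit precision (2 bytes/param):
print(round(model_memory_gb(70, 2), 1))    # 130.4 GiB: far beyond any consumer GPU

# The same model quantized to 4 bits (0.5 bytes/param):
print(round(model_memory_gb(70, 0.5), 1))  # 32.6 GiB: fits in a 64 GB Mac's shared RAM
```

This is why a Mac mini configured with 64 GB of unified memory can run models that would otherwise need multiple high-end GPUs.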

[–] underscores@lemmy.zip 2 points 3 days ago

Ty, this is the answer I was looking for