this post was submitted on 02 Apr 2026
714 points (99.6% liked)
Fuck AI
That's a popular take, especially around here, but AI does have some pretty nice use cases; just not as many as the TechBros would have you believe.
Here are some examples I've personally seen in the last 14 days:
Does all of the "Agentic" woo-woo shit work? No, it absolutely doesn't, but it is clearly getting better as time goes on.
IMO this whole AI thing has some very strong parallels to the early '80s computer industry. Right now it often requires specialist knowledge for good results, which makes it clunky to use; it's somewhat slow; there's very little interoperability; and it requires enormous amounts of power. Hell, even the "over-buying hardware" schtick fits right in: the same thing happened with SRAM and then several times with DRAM as the industry matured.
However, the industry is also making progress at almost insane speed: not only is the output getting demonstrably better, but the negatives are being addressed. In the past 30 days I've seen prototype ASIC-esque hardware that works in a standard desktop PC and processes nearly 10,000 tokens a second, all locally.
The only reason you're not seeing that kind of kit in the market yet is because the models are still changing too much and no one wants to commit hundreds of millions to making cards that would be outdated before they could be shipped. We're probably only 18-24 months away though.
I've also seen 10x improvements in memory usage (TurboQuant) and literally dozens of little tweaks and tricks to reduce footprint and speed processing. Just like what was going on in the PC industry in the '80s and '90s.
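For intuition on where memory savings like that come from, here's a back-of-the-envelope sketch of weight quantization. The function and the overhead factor are my own illustration (TurboQuant's actual method isn't public in this thread); the point is just that cutting bits-per-weight cuts footprint roughly proportionally:

```python
def model_footprint_gb(params_billion: float, bits_per_weight: float,
                       overhead: float = 1.1) -> float:
    """Approximate RAM needed to hold a model's weights.

    overhead loosely accounts for KV cache, activations, etc.
    (a guessed fudge factor, not a measured figure).
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return round(bytes_total * overhead / 1e9, 1)

# A hypothetical 9B-parameter model at fp16 vs. 4-bit quantization:
print(model_footprint_gb(9, 16))  # ~19.8 GB
print(model_footprint_gb(9, 4))   # ~5.0 GB
```

Going from 16-bit to 4-bit weights alone is a ~4x reduction; the larger multiples people quote usually stack quantization with KV-cache compression and other tricks on top.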
So sure, Fuck AI (mostly) as it exists today but it won't be long before it's as ubiquitous as tablets and smartphones.
And we're fucking the world up to... transcribe meetings?
No, it's to make the rich richer.
Many people don't think about what they're doing, why they're doing it, or what its end outcome will be.
I don't think you get why I don't want AI.
All the things you mentioned that AI is good at? That's a bad thing to have. The better the technology becomes, the worse all of our lives become.
AI will steal all jobs. ALL jobs. Even the prostitutes. Whatever your job is, within 10 years AI will do it better than you at a fraction of your cost. Basically for free. And you can't get another job, because ALL jobs are AI now. Build a robot, slap some AI in it, connect it to the main server, and it now has access to every AI unit's database.
And then what about us? Well, the wealthy become the overlords, and we become the slaves.
How is AI gonna replace prostitutes? Maybe porn will shift to AI-generated stuff therefore reducing the number of porn actresses, but actual prostitutes? No way. How is AI gonna replace physical touch?
I think the only industry that's actually safe at this time is psychology. Therapy and mental health are bigger now than ever before. Plus, it requires a real, comprehensive understanding of the human experience that AI simply can't deliver with positive results.
There will probably be attempts, though; I do think they'll be ruled highly illegal.
I actually agree with 99% of what you wrote, but you are a bit optimistic in one regard: they will want some sex slaves, but most of us will be food.
Whoa whoa, has "eat the rich" been one of those situations where the comma is in the wrong place?
It's really "Eat, the rich!"
You could be right, only time will tell.
In order for it to be this ubiquitous, it has to run locally or on commodity hardware, IMO. The true lasting effect of this hype cycle is likely the capabilities being driven into smaller language models that don't have out-of-control resource requirements.
I agree, which is why I shared that I recently saw a prototype ASIC-esque PCI card. The local hardware is coming, the models just need to settle down some before anyone will commit to building that hardware.
In the '90s and '00s you needed a zillion dollars of custom Silicon Graphics workstations and months of processing to do the FX for movies like "Terminator 2". By 2020 you could replicate it in a few hours on commodity hardware.
The LLMs and AI will be the same, it just needs more than 5 years to get there.
Yeah if you can run them locally using a small board, that'll last.
LLMs, as they are, can already run on smartphones, which are pretty ubiquitous themselves.
A flagship phone has 12-16 GB of RAM these days, I believe; a low-end phone, 4 GB.
Here are the sizes of some different parameter count versions of Qwen 3.5, a popular Chinese open-weight LLM:
27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.
9B: 6.6 GB
4B: 3.4 GB
2B: 2.7 GB
0.8B: 1 GB.
For any recently manufactured device, there will be versions of multiple popular LLMs that fit in the RAM it has available.
Most people do not have a smartphone with that amount of RAM. But ultimately, yeah, eventually it'll run on readily available hardware or it'll go into a dustbin.
There's already ollama and stuff. It'll stick around.
I mean, fairly low-end phones are 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.