this post was submitted on 02 Apr 2026
635 points (99.5% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
[–] Lost_My_Mind@lemmy.world 121 points 22 hours ago (4 children)

Ok, so they bought billions of dollars of RAM/storage to put inside servers that haven't been bought yet, to put inside data centers that haven't been built yet, to run AI that doesn't work yet, to chase profits that are impossible to achieve.

And now, after driving RAM prices up to absurd levels, you've begun to realize what all of us knew before day one: NOBODY WANTS THIS SHIT!!!

[–] dylanmorgan@slrpnk.net 44 points 20 hours ago (3 children)

None of that RAM (or the GPUs) has been purchased. All of it is just letters of intent or even flimsier agreements; there are no contracts and no actual money changing hands.

[–] Canconda@lemmy.ca 47 points 19 hours ago

Doesn't matter; they've captured the entire supply chain, which was their goal.

This is not about AGI... it's about monopolizing the future of computing.

[–] humanspiral@lemmy.ca 8 points 15 hours ago (1 children)

The good news for RAM prices comes when OpenAI either makes money reselling those "contracts," or the letters of intent get cancelled and fabs go back to making DDR5 instead.

[–] Napster153@lemmy.world 7 points 14 hours ago

We'd hope, but you just know they'll still try to rob us blind somehow. Intelligence is second from the bottom to these people.

[–] InputZero@lemmy.world 19 points 19 hours ago

All that is just letters of intent or even flimsier agreements, there’s no contracts or actual money changing hands.

Not quite. While none of it has actually been manufactured yet, the pre-production work (procurement, scheduling machine time) is what's going to make retooling for consumer RAM take forever. TSMC or whoever can't just flip a switch and produce a different product; changing over production that complicated takes weeks to months. Money will change hands, and work has already been done and agreed upon.

[–] Canconda@lemmy.ca 16 points 19 hours ago* (last edited 19 hours ago) (2 children)

Yup and this is all going according to plan.

  1. Corner the Market

  2. Raise prices

  3. Sell High - The bubble will not burst until they're ready to leave us holding the bag. The burst will be triggered by the sell off.

  4. Buy low - All these assets will be liquidated during bankruptcy for pennies on the dollar; to the same shareholders as before.

  5. Rinse

  6. Repeat

  7. Fuck you (and me)

[–] Buelldozer@lemmy.today 10 points 18 hours ago (1 children)

Congratulations, you've just covered how the Computer / Tech Industry has worked since mainframes were invented. It's a constant cycle of $NewThing that almost works, desperate effort by a lot of companies to make it work right / better, market cornering, BoomTime for a lucky few companies, then someone figures out how to do it cheaper or re-focus the market on something slightly different, then BustTime.

[–] brownsugga@lemmy.world 1 points 3 minutes ago

It's not just computer/tech but pretty much all financial markets.

[–] jello8_@lemmy.today 1 points 11 hours ago (1 children)

You forgot to throw a couple of bailouts in there, and some regulatory capture mandating AI slop in your cars or something for "safety."

[–] Tollana1234567@lemmy.today 1 points 9 hours ago

They are peddling AI so hard as surveillance tech for governments because they think that's where the constant revenue stream is.

[–] bridgeenjoyer@sh.itjust.works 19 points 21 hours ago* (last edited 21 hours ago) (4 children)

I agree with you completely, but I wouldn't say "no one wants this."

The oligarchs have poured in billions and bought off every media company to constantly spout off about AI companies, so your general normie thinks it's "the future." Almost every (normie) person I know (except one who is anti-AI, and he's a geek) is using some form of slopbot for tons of things: easy Excel formulas (that anyone can do), turning pictures black and white (which literally any photo program has been able to do for 30+ years), summarizing documents (because people are idiots now and have no reading comprehension), etc. The normies LOVE it and eat up the slop. Especially if they were stupid at computers before; now they think they're on the level of Woz because they told a chatbot to make slop code.

The company I'm in can't go 3 seconds without bringing up "ai innovation" and "being future ready".

It's only here on Lemmy that people dislike it. The rest of the world is already addicted, and we are screwed.

[–] cynar@lemmy.world 22 points 20 hours ago (1 children)

I've seen quite a few people who make casual use of it. The key point is that it is currently free to them. As soon as it starts costing money, a lot will bail on it.

[–] OpenStars@piefed.social 3 points 16 hours ago

Several people I know have decided to already start paying for it.

[–] SalamenceFury@piefed.social 10 points 20 hours ago (2 children)

The only people who are excited for AI are Twitter users and rich people. It's not just on Lemmy.

[–] bridgeenjoyer@sh.itjust.works 2 points 16 hours ago

Haha, so right on the Twitter users. Funny how much correlation there is: low intelligence - xitter - sloperator user.

[–] MrKoyun@lemmy.world 1 points 16 hours ago

People don't need to be excited about it to want and use it.

[–] ragas@lemmy.ml 3 points 16 hours ago (1 children)

Hmm, it's different in my bubble. Most of the people I know use AI sparingly and generally don't trust the results without checking.

[–] Tollana1234567@lemmy.today 1 points 9 hours ago

I used it for the first time a month ago. It doesn't even give correct info; it just assumes what it sees from other sites, and it doesn't have checks to distinguish comments, posts, or blogs from official info.

[–] OpenStars@piefed.social 2 points 16 hours ago

thinks its “the future”

Sort of, yeah. The thing is... it IS the future, whether we like it or not... it's just not the PRESENT.

[–] Buelldozer@lemmy.today -2 points 18 hours ago* (last edited 18 hours ago) (3 children)

NOBODY WANTS THIS SHIT!!!

That's a popular take, especially around here, but AI does have some pretty nice use cases; just not as many as the TechBros would have you believe.

Here's some examples I've personally seen in the last 14 days:

  1. It's good at transcribing meetings, including picking out who is talking, backing into an agenda, and highlighting action items.
  2. It's darn good at writing even moderately complex scripts in any of the common languages. (Powershell, Python, R, etc)
  3. In the right hands (fingers?) it's getting increasingly good at finding and exploiting security flaws.
  4. It's amazing at slicing and dicing data if the person using it knows what they're doing.

Does all of the "Agentic" Woo Woo shit work? No, it absolutely doesn't but it is clearly getting better as time goes on.

IMO this whole AI thing has some very strong parallels to the early '80s computer industry. Right now it often requires specialist knowledge for good results, which makes it clunky to use; it's somewhat slow; there's very little interoperability; and it requires enormous amounts of power. Hell, even this "over-buying hardware" schtick fits right in; the same thing happened with SRAM and then several times with DRAM as the industry matured.

However, the industry is also making progress at almost insane speed; not only is the output getting demonstrably better, but the negatives are being addressed. In the past 30 days I've seen prototype ASIC-esque hardware that works in a standard desktop PC and processes nearly 10,000 tokens a second locally.

The only reason you're not seeing that kind of kit in the market yet is because the models are still changing too much and no one wants to commit hundreds of millions to making cards that would be outdated before they could be shipped. We're probably only 18-24 months away though.

I've also seen 10x improvements in memory usage (TurboQuant) and literally dozens of little tweaks and tricks to reduce footprint and speed processing. Just like what was going on in the PC industry in the '80s and '90s.
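The memory gains from quantization aren't magic; the core trick behind most schemes can be sketched in a few lines of Python. This is illustrative only: the comment above names "TurboQuant" but gives no details, so the block size and bit width here are arbitrary choices, not anyone's actual format.

```python
import numpy as np

def quantize_4bit(weights: np.ndarray, block: int = 64):
    """Symmetric 4-bit quantization: one fp scale per block of weights,
    each weight stored as a small integer in -7..7."""
    w = weights.reshape(-1, block)
    scale = np.abs(w).max(axis=1, keepdims=True) / 7  # int4 symmetric range
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return (q * scale).reshape(-1)

rng = np.random.default_rng(0)
weights = rng.standard_normal(1024 * 64).astype(np.float32)

q, scale = quantize_4bit(weights)
restored = dequantize(q, scale)

# Memory: 4 bits/weight + 16 bits/block for the scale, vs 32 bits/weight.
bits_fp32 = weights.size * 32
bits_q4 = weights.size * 4 + scale.size * 16
print(f"compression: {bits_fp32 / bits_q4:.1f}x")
print(f"max abs error: {np.abs(weights - restored).max():.3f}")
```

The point is that the reconstruction error stays small relative to the weight magnitudes while memory drops by most of an order of magnitude, which is why quantized models fit on consumer hardware at all.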

So sure, Fuck AI (mostly) as it exists today but it won't be long before it's as ubiquitous as tablets and smartphones.

[–] MrKoyun@lemmy.world 10 points 16 hours ago (1 children)

And we're fucking the world up to... transcribe meetings?

[–] OpenStars@piefed.social 3 points 16 hours ago

No, it's to make the rich richer.

Many people do not think about what or why they are doing what they do, or what its end outcome will be.

[–] Lost_My_Mind@lemmy.world 9 points 18 hours ago (3 children)

I don't think you get why I don't want AI.

All the things you mentioned that AI is good at? That's a bad thing to have. The better the technology gets, the worse all of our lives become.

AI will steal all jobs. ALL jobs. Even the prostitutes. Whatever your job is, AI within 10 years will do it better than you at a fraction of your cost. Basically for free. And you can't get another job, because ALL jobs are AI now. Build a robot, slap some AI in it, connect it to the main server, and it now has access to every AI units databases.

And then what about us? Well, the wealthy become the overlords, and we become the slaves.

[–] SummerReaper@lemmy.world 1 points 8 hours ago

I think the only industry that's actually safe right now is psychology. Therapy and mental health are bigger than ever, and they require a real, comprehensive understanding of the human experience that's simply impossible for AI to replicate with positive results.

There'll probably be attempts, though; I do think it'll be ruled highly illegal.

[–] lemmy_outta_here@lemmy.world 2 points 17 hours ago (1 children)

I actually agree with 99% of what you wrote, but you are a bit optimistic in one regard: they will want some sex slaves, but most of us will be food.

[–] Zink@programming.dev 3 points 15 hours ago

Whoa whoa, has "eat the rich" been one of those situations where the hyphen/comma is in the wrong place?

It's really "Eat, the rich!"

[–] Buelldozer@lemmy.today -1 points 17 hours ago

You could be right, only time will tell.

[–] aesthelete@lemmy.world 3 points 18 hours ago (2 children)

So sure, Fuck AI (mostly) as it exists today but it won’t be long before it’s as ubiquitous as tablets and smartphones.

In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO. The true lasting effects from this hype cycle are likely the capabilities that are being driven into smaller language models that don't have out of control resource requirements.

[–] Buelldozer@lemmy.today 2 points 17 hours ago* (last edited 17 hours ago) (1 children)

In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

I agree, which is why I shared that I recently saw a prototype ASIC-esque PCI card. The local hardware is coming, the models just need to settle down some before anyone will commit to building that hardware.

In the '90s and '00s you needed a zillion dollars of custom Silicon Graphics workstations and months of processing to do the FX for movies like "Terminator 2". By 2020 you could replicate it in a few hours on commodity hardware.

LLMs and AI will be the same; they just need more than 5 years to get there.

[–] aesthelete@lemmy.world 1 points 10 hours ago

Yeah if you can run them locally using a small board, that'll last.

[–] boonhet@sopuli.xyz 1 points 18 hours ago (1 children)

In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

LLMs, as they are, can already run on smartphones, which are themselves pretty ubiquitous.

A flagship phone has 12-16 GB of RAM these days, I believe; a low-end phone, 4 GB.

Here are the sizes of some different parameter count versions of Qwen 3.5, a popular Chinese open-weight LLM:

27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.

9B: 6.6 GB

4B: 3.4 GB

2B: 2.7 GB

0.8B: 1 GB.

For any recently manufactured device, there will be versions of multiple popular LLMs that will run on the RAM size they have available.
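The figures above can be sanity-checked with back-of-the-envelope arithmetic, assuming roughly 5 bits per parameter (a ballpark for 4-bit quantized weights plus scales and embeddings; real file sizes vary by quantization scheme and model, and the 1.5 GB headroom figure is an assumption, not a measurement):

```python
BITS_PER_PARAM = 5  # rough figure for 4-bit quantized weights plus overhead

def model_size_gb(params_billion: float) -> float:
    """Estimated on-device size of a quantized model, in GB."""
    bits = params_billion * 1e9 * BITS_PER_PARAM
    return bits / 8 / 1e9

def fits(params_billion: float, phone_ram_gb: float,
         headroom_gb: float = 1.5) -> bool:
    """Leave some RAM for the OS and the inference runtime."""
    return model_size_gb(params_billion) <= phone_ram_gb - headroom_gb

for params in (27, 9, 4, 2, 0.8):
    size = model_size_gb(params)
    print(f"{params}B ~ {size:.1f} GB | "
          f"flagship (16 GB): {fits(params, 16)} | "
          f"low-end (4 GB): {fits(params, 4)}")
```

Under these assumptions a 27B model lands near 17 GB (matching the figure quoted above and just out of reach of a 16 GB flagship), while sub-1B models fit comfortably even on a 4 GB phone.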

[–] aesthelete@lemmy.world 1 points 10 hours ago* (last edited 10 hours ago) (1 children)

Most people do not have a smartphone with that amount of RAM. But ultimately, yeah, eventually it'll run on readily available hardware or it'll go in the dustbin.

There's already ollama and the like. It'll stick around.

[–] boonhet@sopuli.xyz 1 points 9 hours ago

I mean, fairly low-end phones have 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.