this post was submitted on 02 Apr 2026
677 points (99.6% liked)

Fuck AI

[–] boonhet@sopuli.xyz 1 points 20 hours ago (1 children)

In order for it to be this ubiquitous, it has to run locally or on commodity hardware, IMO.

LLMs, as they are now, can already run on smartphones, which are pretty ubiquitous themselves.

A flagship phone has 12-16 GB of RAM these days, I believe; a low-end phone, around 4 GB.

Here are the weight sizes of several parameter-count versions of Qwen 3.5, a popular Chinese open-weight LLM:

27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.

9B: 6.6 GB

4B: 3.4 GB

2B: 2.7 GB

0.8B: 1 GB

For any recently manufactured device, there will be versions of multiple popular LLMs that fit in the RAM it has available.

[–] aesthelete@lemmy.world 1 points 12 hours ago* (last edited 12 hours ago) (1 children)

Most people do not have a smartphone with that amount of RAM. But ultimately, yeah, eventually it'll run on readily available hardware or it'll go into a dustbin.

There's already ollama and stuff. It'll stick around.

[–] boonhet@sopuli.xyz 1 points 10 hours ago

I mean, fairly low-end phones have 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.