this post was submitted on 06 May 2025
1207 points (98.2% liked)

skills for rent (lemmy.blahaj.zone)
submitted 1 month ago* (last edited 1 month ago) by not_IO@lemmy.blahaj.zone to c/fuck_ai@lemmy.world
[–] ikidd@lemmy.world 3 points 1 month ago (2 children)

You can use local models for free; they're just slower.

And local models usually have fewer parameters, so reasoning on a local model is quite poor.
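For anyone who hasn't tried it, running a model locally is only a couple of commands with a runner like Ollama; the model tag below is just an example, and speed/quality depend heavily on parameter count and your hardware:

```shell
# Pull a small local model and run a one-off prompt with Ollama.
# "llama3.1:8b" is an example tag; larger-parameter variants reason
# better but need far more RAM/VRAM and run slower on consumer hardware.
ollama pull llama3.1:8b
ollama run llama3.1:8b "Explain tail recursion in one paragraph."
```

Everything stays on your own machine, which also matters for the proprietary-code concern raised below.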

[–] CMonster@discuss.online 1 point 1 month ago (2 children)

Why would you not want to use all the tools available to be as efficient as possible?

[–] ikidd@lemmy.world 3 points 1 month ago

There are a lot of dev teams that have to use local models because their code is proprietary and they don't want it leaving their network.