
Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

skills for rent (lemmy.blahaj.zone)
submitted 2 months ago* (last edited 2 months ago) by not_IO@lemmy.blahaj.zone to c/fuck_ai@lemmy.world
 
[–] ikidd@lemmy.world 3 points 2 months ago (2 children)

You can use local models for free, it's just slower.
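
For example, here is a minimal sketch of what running a model locally can look like, using the llama-cpp-python bindings (the model path is a placeholder and the prompt is just an illustration; any GGUF model file you have downloaded would work):

```python
# Minimal sketch: run a quantized model entirely on your own hardware.
# Assumes llama-cpp-python is installed (pip install llama-cpp-python)
# and that a GGUF model file has already been downloaded; the path
# below is a placeholder, not a real file.
from llama_cpp import Llama

# Loads the weights from disk; no API key, no network, no per-token fees.
llm = Llama(model_path="models/some-model.gguf")

out = llm(
    "Q: What does a Python list comprehension do? A:",
    max_tokens=128,  # small local models are slow, so keep generations short
)
print(out["choices"][0]["text"])
```

The trade-off is the one described above: nothing leaves your machine and nothing is billed, but generation speed depends entirely on your hardware.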

[–] Knock_Knock_Lemmy_In@lemmy.world 1 points 2 months ago

And local models usually have fewer parameters. Reasoning on a local model is very poor.

[–] CMonster@discuss.online 1 points 2 months ago (8 children)

Why would you not want to use all the tools available to be as efficient as possible?

[–] Notserious@lemmy.ca 2 points 2 months ago

You can always tell when you're on a new bug: you ask about the error “exception when calling…” and the AI returns your exact implementation back as the solution.

Not really intelligent

[–] oo1@lemmings.world 2 points 2 months ago (2 children)

WTF "vibe coding"? I'm not even wasting the electricity to googgle that one.

[–] vane@lemmy.world 2 points 2 months ago* (last edited 2 months ago)

I still think that local models in places without internet are better than offline documentation.
