this post was submitted on 04 Feb 2026
562 points (98.8% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

[–] pseudo@jlai.lu 50 points 1 day ago (4 children)

Why use an LLM to solve a problem you could solve with an alarm clock and a Post-it note?

[–] enbiousenvy@lemmy.blahaj.zone 33 points 1 day ago* (last edited 1 day ago) (2 children)

programming nitpicks (for lack of a better word) that I used to hear:

  • "don't use u32, you won't need that much data"
  • "don't use using namespace std"
  • "sqrt is expensive, if necessary cache it outside loop"
  • "I made my own vector type because the one from standard lib is inefficient"

then this person implements time checking via an LLM over the network, at a cost of $0.75 per check lol
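The sqrt nitpick from the list above, as a minimal sketch (the function and variable names are made up for illustration):

```cpp
#include <cmath>
#include <vector>

// "Cache sqrt outside the loop": the square root of a loop-invariant value
// is computed once up front instead of once per element.
void normalize(std::vector<double>& samples, double variance) {
    const double std_dev = std::sqrt(variance);  // hoisted out of the loop
    for (double& s : samples) {
        s /= std_dev;                            // no per-iteration sqrt call
    }
}
```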

[–] cecilkorik@piefed.ca 19 points 23 hours ago (1 children)

We used to call that premature optimization. Now we complain tasks don't have enough AI de-optimization. We must all redesign things that we have done in traditional, boring not-AI ways, and create new ways to do them slower, millions or billions of times more computationally intensive, more random, and less reliable! The market demands it!

[–] very_well_lost@lemmy.world 13 points 22 hours ago* (last edited 16 hours ago)

I call this shit zero-sum optimization. In order to "optimize" for the desires of management, you always have to deoptimize something else.

Before AI became the tech craze du jour, I had a VP get obsessed with microservices (because that's what Netflix uses, so it must be good). We had to tear apart a mature and very efficient app and turn it into hundreds of separate microservices... all of which took ~100 milliseconds to interoperate across the network. Pages that used to take 2 seconds to serve now took 5 or 10 because of all the new latency required to do things they used to be able to do basically for free. And it's not like this was a surprise. We knew this was going to happen.
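Back-of-envelope on that (the per-page call count is my guess, not something measured): if serving a page now chains on the order of 30 to 80 cross-service calls at ~100 ms each, that's 30 × 100 ms = 3 s up to 80 × 100 ms = 8 s of pure network latency on top of the original 2 s, i.e. right in the 5-10 second range.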

But hey, at least our app became more "modern" or whatever...

[–] AnyOldName3@lemmy.world 7 points 21 hours ago

using namespace std is still an effective way to shoot yourself in the foot, and if anything it's a bigger problem than it was in the past, now that std has decades' worth of extra stuff in it that could have a name collision with something in your code.
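A minimal sketch of the kind of collision meant here, with a hypothetical helper (any function that shares a name with something in std works the same way):

```cpp
#include <algorithm>
#include <iostream>

using namespace std;  // pulls every std name into unqualified lookup

// A perfectly ordinary local helper that happens to share its name with
// std::count from <algorithm>.
template <typename It, typename T>
int count(It first, It last, const T& value) {
    int n = 0;
    for (; first != last; ++first)
        if (*first == value) ++n;
    return n;
}

int main() {
    int data[] = {1, 2, 2, 3};

    // With the using-directive above, this unqualified call is ambiguous
    // between ::count and std::count and fails to compile:
    //     cout << count(data, data + 4, 2) << '\n';

    // It has to be spelled with an explicit qualifier instead:
    cout << ::count(data, data + 4, 2) << '\n';  // prints 2
}
```

The usual fix is to drop the using-directive entirely, or at least confine it to the narrowest scope you can.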

[–] Rooster326@programming.dev 5 points 18 hours ago

Nooo, you don't understand. It needs to be wrong up to 60% of the time. He would need a broken clock, a window, and a Post-it note.

[–] rumba@lemmy.zip 2 points 20 hours ago

For the clicks.

[–] Prior_Industry@lemmy.world 1 points 1 day ago (1 children)

Or if you're being fancy, poll a time server.

[–] pseudo@jlai.lu 1 points 19 hours ago (1 children)

That would work great as well, but the alarm clock is a technology developed in the Middle Ages.

[–] Prior_Industry@lemmy.world 1 points 18 hours ago* (last edited 18 hours ago)

Or go off-grid style and leave your curtains open 😂