this post was submitted on 19 Feb 2026
159 points (99.4% liked)

Fuck AI

6005 readers
1485 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago

Critical thinking truly is dying, and at such speed I can barely believe my eyes. This is a tough read.

all 20 comments
[–] jpreston2005@lemmy.world 20 points 4 days ago* (last edited 4 days ago) (4 children)

Wow, men were already creepy and stalkery enough; now we've got some constantly online machine whose whole job seems to be validating whatever crazy behavior the person is engaged in. As the article puts it, now you "...can have mob mentality without the mob."

That's scary. The internet used to let you connect with other people who had the same interests as you, making you feel a lot less alone in the world. Now it doesn't bother connecting you with anyone; it just gives you a nice positive feedback loop of your own crazy turned up to 11. Everybody with their own personal fan-girl in their pocket, available any time of day or night. Man, people are just too stupid. You can't go around validating everybody's ideas. Some people have some crazy fucking ideas.

[–] Reygle@lemmy.world 14 points 4 days ago

I hear you. I refuse to touch AI and am starting to look into how I can do my part to "poison" their data, but it's already mentally messing up my coworkers and family members.

[–] captainlezbian@lemmy.world 13 points 4 days ago

It's not even stupidity; it's that we're a social species and we do error correction by running thoughts past others. You're biologically primed to assume friends, family, etc. are of similar authority on truth to yourself. When we disagree with the crowd, we reevaluate. That's how cults can reprogram smart people: by isolating them.

[–] nwtreeoctopus@sh.itjust.works 8 points 4 days ago

The "lack of friction" aspect seems like an understated issue. I think it's bad, but less problematic, in contexts like shopping online; but it's so important to see that "WTF?" look on the people around you when you've got a crazy idea.

Chatbots are like an improv exercise in which you have to "yes, and" any madness pitched your way.

[–] SalamenceFury@piefed.social 14 points 4 days ago* (last edited 4 days ago) (1 children)

Jesus Christ, this technology has to be destroyed. It literally turned a normal dude into a delusional stalker. I've observed that most people who tend to hate LLMs are artists and/or queer people, many of whom have some sort of mental illness, know a thing or two about being manic, and thus refuse to engage with chatbots because they know it'll destroy their lives (it's me, I'm both, and so are all of my friends). Normies have no such safeguards or experience and thus are the most vulnerable to having their brains cooked by a chatbot.

[–] captainlezbian@lemmy.world 11 points 4 days ago* (last edited 4 days ago) (1 children)

Jesus fuck, that's bad. And it really reminds me of my experiences being targeted by a delusional woman with a personality disorder, which is something I'd never wish on anyone. In fact, it kinda makes me wonder if she was using AI during the whole thing (probably not; she seemed to be lifelong crazy).

Also, this tech feels increasingly Lovecraftian: we're churning up the destruction of the fundamental particles of matter, using a technology made to end the world, to create horrible machines that think unthought thoughts that drive people to a destructive madness.

[–] wizblizz@lemmy.world 1 points 2 days ago

Eldritch horrors that whisper madness is a solid comparison. I think the Reapers in Mass Effect are a great one too, their indoctrination supplanting personal belief and thought with their own.

[–] Asidonhopo@lemmy.world 7 points 4 days ago

With how aggressively it's being pushed, I wonder what percentage of CEOs and their ilk are in full-blown AI psychosis at this point.

[–] BlackRoseAmongThorns@slrpnk.net 9 points 4 days ago (1 children)

Only remotely relevant, but I recently called my mother, with whom I'd cut off contact years ago, to check on her after some distant family reached out to beg me to make sure she's fine.

She started talking about coming up with an "explanation" for some internal family drama, proudly exclaiming she came up with it all by herself: basically a conspiracy blaming my younger brother. I'm afraid she was using an LLM to affirm her delusions.

Wish I had more to say; this world can be so cruel :(

[–] Reygle@lemmy.world 9 points 4 days ago (1 children)

I hear you. "AI Psychosis" is a real thing and it's messing up tons of people. https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis

If things are rough enough that you had to cut your mother from contact, remember you can always call local authorities for a wellness check, because it sounds like speaking directly with her may not be good for you, either.

... it sounds like speaking directly with her may not be good for you, either.

It isn't; contact was cut a few years ago because of that. As I'm not from the US, I'll have to research what the closest thing to a "wellness check" is in my country, hopefully not something involving the police.

[–] qprimed@lemmy.ml 7 points 4 days ago

as OP indicates, it's a rough read - but worth your time and sharing. thanks, OP.

[–] deliriousdreams@fedia.io 5 points 4 days ago

I wonder if "leading to" is the right phrase.

In my view of things, echo chambers have always existed, and it could just as easily have been an "AITA" post on Reddit that fed this man's delusions and convinced him to stalk this woman.

The fact is, this happens more often than society likes to think, and even when you think you're safe and the person's fixation on you has waned, you may find yourself in danger, because that feeling of distance over time has only led to a false sense of security.

It is apparent to me that generative AI chatbots are heavily exacerbating the situation, though, for one because there's no friction in their echo chamber. An AITA post, or posts like it, would likely get at least some users who disagree with the general consensus. And that friction matters.

I have questions about the validity of basing a person's mental health on their own view of it over the years, because for the most part everyone has trauma, and most people aren't medical professionals with the right skill set to diagnose, or even properly observe, the key behaviors that would factor into such a diagnosis.

Every time I see one of these articles, they seem to highlight that the people involved never experienced "mania" or "psychosis." But it's usually them saying that about themselves, not a mental health professional commenting on it.