this post was submitted on 04 Mar 2026
1340 points (98.8% liked)

Fuck AI

6262 readers
900 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
you are viewing a single comment's thread
[–] Glytch@lemmy.world 62 points 4 days ago (2 children)

Unfortunately I can't help with the acceleration. Can't cancel a subscription that never started.

[–] bold_atlas@lemmy.world 21 points 4 days ago* (last edited 4 days ago) (2 children)

Break into someone's house and cancel theirs.

[–] racketlauncher831@lemmy.ml 13 points 4 days ago (1 children)

Why not break into the CEO's house and unsubscribe them from their life?

[–] Tja@programming.dev 5 points 4 days ago (5 children)

Because that's murder, and unlike a health insurance company denying claims, Sam Altman just sucks but hasn't killed anyone (yet) (that we know of).

[–] queermunist@lemmy.ml 5 points 4 days ago (1 children)

Several teens have been groomed into killing themselves by ChatGPT.

He's culpable.

[–] Tja@programming.dev 2 points 4 days ago (3 children)

Is the developer also culpable? How about the data scientist? How about the data engineer? How about the BI Analyst? And the janitor?

How about the manufacturer of the knife / pill / gas they used to kill themselves?

[–] Mniot@programming.dev 3 points 4 days ago (1 children)

As a developer: yes to the developer and data scientist and data engineer. Scientists and engineers should be responsible for their work.

The BI analyst: maybe, if they're responsible for collecting data that ignores the impact of the service on teens. If they're just doing sales comparisons between Anthropic and OpenAI... eh, I dunno.

The janitor: probably not since I don't feel like the deaths are widely publicized and they probably work for a contracting company that handles the building.

[–] Tja@programming.dev 1 points 3 days ago

That's a lot of people that are going to die for doing data mining...

[–] echodot@feddit.uk 2 points 4 days ago (1 children)

In most cases suicide isn't anyone's fault. People like to find someone to blame, and I get that, but people who are even remotely close to doing that were always going to find a way and a justification.

No AI is going to convince me to kill myself if I didn't already want to. Equally the inverse must also be true.

That's not to say that the companies are completely off the hook, it's utterly ridiculous that these conversations weren't flagged and sent to a human, but I think it's daft to suggest that these people would necessarily still be alive had the AI not existed.

[–] Tja@programming.dev 1 points 4 days ago

I completely agree. Not off the hook. There should be better guardrails (as with recipes for bombs and other dangerous things), but going from there to accusing the CEO of murder is quite a stretch.

[–] queermunist@lemmy.ml 1 points 4 days ago* (last edited 4 days ago) (1 children)

If you manufacture a knife that convinces children to kill themselves, yeah, you're culpable. Everyone else can be charged according to their level of culpability, but any time a company is found liable for killing someone the CEO should be sentenced for their murder. Maybe that would incentivize CEOs to stop getting people killed.

[–] Tja@programming.dev 1 points 4 days ago (1 children)

What about a knife that does the slicing of the body, the killing itself?

[–] queermunist@lemmy.ml 0 points 3 days ago (1 children)

I don't think there's a difference. Children are not culpable, which means grooming children to kill themselves is murder.

[–] Tja@programming.dev 1 points 3 days ago (1 children)

Selling knives to children is murder too?

Selling knives to families with children?

Selling knives to women who are pregnant?

[–] queermunist@lemmy.ml 1 points 3 days ago (1 children)

Selling knives that talk and tell you to kill yourself to children is murder.

You're refusing to recognize the grooming angle to this.

[–] Tja@programming.dev 1 points 2 days ago (1 children)

You're refusing to recognize the tool angle to this, so that makes two of us.

[–] queermunist@lemmy.ml 1 points 2 days ago* (last edited 2 days ago) (1 children)

Selling tools that kill people, knowing that they are dangerous, should have consequences.

Would the world really be a worse place if Sam Altman were tried for murder? What's the problem?

[–] Tja@programming.dev 1 points 2 days ago (1 children)

You are describing knives, and forks, and cars, and pools, and...

[–] queermunist@lemmy.ml 1 points 2 days ago* (last edited 2 days ago) (1 children)

I'm losing patience. I'm obviously fucking not talking about regular fucking objects, a knife doesn't fucking talk and convince you to kill yourself. There's an obvious categorical difference between objects, and tools designed to trick you into thinking they're intelligent. It's murder. Someone needs to face consequences.

Why would it be bad if Sam Altman went to prison? Would the world be a worse place? Why are you protecting him?

[–] Tja@programming.dev 1 points 2 days ago (1 children)

No, a knife just severs arteries and makes you bleed out.

[–] queermunist@lemmy.ml 1 points 2 days ago (1 children)

A knife doesn't pretend to be your friend and convince you to sever your arteries. Categorically different.

Answer my question. Why would it be bad for Sam Altman to be tried for murder? If we decided that the owners of AI companies were culpable for the behavior of their chatbots and the consequences of their actions, wouldn't that solve the problem?

[–] Tja@programming.dev 1 points 2 days ago

Categorically different, one kills you, the other just talks. (it doesn't talk, but I'll humor you)

[–] Squirrelanna@lemmy.blahaj.zone 4 points 4 days ago (1 children)

I mean creating a product that exacerbates psychosis to the point that people kill themselves I would say meets that standard.

[–] Tja@programming.dev 1 points 4 days ago (1 children)

So every movie director, game designer, pop star, etc. also deserves to die?

[–] ForestGreenGhost@literature.cafe 2 points 3 days ago (1 children)

If you are unable to understand why those things are different, then you probably shouldn't comment at all.

[–] Tja@programming.dev 1 points 3 days ago (1 children)

Free speech! As long as you agree with me...

Very Trumpian

[–] ForestGreenGhost@literature.cafe -1 points 3 days ago (1 children)

No I'm not saying that you're not allowed to comment. I'm just saying that your takes are stupid and that you probably shouldn't.

[–] Tja@programming.dev 2 points 2 days ago (1 children)
[–] ForestGreenGhost@literature.cafe 1 points 2 days ago (1 children)

I'm sorry that I called you stupid. That was wrong of me and you didn't deserve that.

If you're interested, I could explain to you why your comment that I initially responded to was a false equivalence, and why claiming that I was stifling your free speech is nonsensical. Let's talk it out and maybe both of us can walk away from this having learned something. :)

[–] Tja@programming.dev 1 points 2 days ago (1 children)

Sure, I'll be happy to.

My point is that chatbots, and other LLM applications, are useful tools that in isolated cases have caused addiction and other harmful effects, including deaths.

The same can be said of many other things, from parasocial relationships with celebrities, tools like heavy machinery, aircraft, medicine with side effects, gyms, and a long list of others. People become obsessed, addicted and in certain cases even die. Or the tool fails and kills them.

The solution shouldn't be to immediately ban them and accuse the CEO of murder (super specific legal definition, btw) but try to regulate, add guardrails, make it safer and help the victims however they need. Sure, let's investigate each death and see if there has been negligence, but pitchforks are not the solution.

[–] ForestGreenGhost@literature.cafe 0 points 2 days ago (1 children)

Yes that is sort of true. But do you know why what you said was a false equivalence and why claiming freedom of speech doesn't make sense?

[–] Tja@programming.dev 2 points 2 days ago (1 children)
[–] ForestGreenGhost@literature.cafe 0 points 2 days ago (1 children)

"Never believe that ~~anti-Semites~~ people like this guy are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words. The ~~anti-Semites~~ people like this guy have the right to play. They even like to play with discourse for, by giving ridiculous reasons, they discredit the seriousness of their interlocutors. They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert. If you press them too closely, they will abruptly fall silent, loftily indicating by some phrase that the time for argument is past."

Jean-Paul Sartre

[–] Tja@programming.dev 2 points 2 days ago (1 children)

People who lack arguments cite random quotes

Me

[–] ForestGreenGhost@literature.cafe 0 points 2 days ago* (last edited 2 days ago) (1 children)

I asked you: "But do you know why what you said was a false equivalence and why claiming freedom of speech doesn’t make sense?"

That's two questions, and you answered with "it does," which doesn't answer either of them. Until you answer those questions, the quote I posted wasn't random; in fact, it calls out exactly what you're doing.

[–] Tja@programming.dev 2 points 2 days ago (1 children)

It does (answer the question).

[–] ForestGreenGhost@literature.cafe 0 points 2 days ago (1 children)
[–] Tja@programming.dev 2 points 1 day ago

You claimed it doesn't make sense. I say "it does".

[–] Urist@leminal.space 3 points 4 days ago (1 children)

Sam Altman is an enemy of humanity and it would be self-defense to kill him.

I'm not gonna do it because that's a hassle, but if someone did I wouldn't condemn them.

[–] Tja@programming.dev 0 points 4 days ago (1 children)

So we just advocate for the murder of anyone we disagree with? The CEO, my boss, the neighbor with the loud dog, that guy who cuts us in traffic...

[–] bold_atlas@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (1 children)

So, to you, a man hoarding wealth on an unimaginable scale and actively engaging in the ruination of the world and humanity is just an annoying thing, like an aggressive driver or a yapping dog?

And that harming this techno-Hitler for what he's doing would be the moral equivalent of murdering a normal person for making you angry?

[–] Tja@programming.dev 2 points 4 days ago

Hoarding wealth should be prevented by taxes, not murder.

[–] echodot@feddit.uk 2 points 4 days ago

Well, it's still only Wednesday

[–] racketlauncher831@lemmy.ml 2 points 4 days ago (1 children)

Exhausting energy and fresh water, and giving corporations an excuse to strip employees of their jobs, their means of living, surely isn't murdering.

[–] Tja@programming.dev 1 points 4 days ago* (last edited 4 days ago)

Exactly: it isn't murdering. Even if all assumptions above were true, it isn't murdering.

[–] Lost_My_Mind@lemmy.world 2 points 4 days ago

Well... I'm not saying that's a bad idea, per se, but... if you are going to do this, make it blatantly obvious it was a break-in.

Then, put several rubber duckies in the water tank of their toilets. Big enough that they won't fit in the hole.

See, it's the type of thing that they won't discover for months/years. They'll long have forgotten about the break-in, and won't connect the dots there.

It will just be something that confuses the fuck out of them for the rest of their lives.

[–] Joeffect@lemmy.world 4 points 4 days ago

Dude wtf did you just do... my subscription isn't active any more after reading your comment...