this post was submitted on 28 Jun 2025

TechTakes

[–] diz@awful.systems 19 points 23 hours ago* (last edited 23 hours ago) (1 children)

It's curious: if ChatGPT were a person saying exactly the same words, he would've been charged with criminal conspiracy, or even shot, as his human co-conspirator in Florida was.

And had it been a foreign human in the Middle East radicalizing random people, he would've gotten a drone strike.

"AI" - and the companies building them - enjoy the kind of universal legal immunity that is never granted to humans. That needs to end.

[–] irelephant@lemmy.dbzer0.com 12 points 20 hours ago (1 children)

A computer can't be held accountable.

[–] diz@awful.systems 6 points 19 hours ago* (last edited 19 hours ago) (1 children)

In theory, at least, the purpose of criminal justice is crime prevention. And if arresting a person would serve that purpose, then a court-ordered shutdown of a chatbot would serve that same purpose.

There's no First Amendment right to enter into a criminal conspiracy to kill people. Not even if "people" is Sam Altman.

[–] atrielienz@lemmy.world 3 points 18 hours ago (1 children)

In practice, the justice system is reactive. A crime being committed, or the prospect of one, leads to laws being created that prohibit it and mark it as criminal; then law enforcement and the justice system as a whole investigate instances where that crime is suspected, and litigation ensues.

Prevention may be the intent, but in reality we know this doesn't prevent crime. Anywhere outside the jurisdiction of a justice system that puts such "safeguards" in place, people will abuse that lack of jurisdiction. And people inside it with enough money or status, or both, will continue to abuse the system for personal gain. Which is pretty much what's happening now, except that they've realized they can preempt litigation against them by buying off the litigants or part of the regulatory/judicial system.

[–] diz@awful.systems 6 points 17 hours ago* (last edited 17 hours ago)

If it were a basement dweller with a chatbot that could be mistaken for a criminal co-conspirator, he would've been arrested and his computer seized as evidence, and then it would be a crapshoot whether he could even convince a jury that it was an accident. Especially if he was getting paid for his chatbot. Now, I'm not saying that this is right, just stating how it is for normal human beings.

It may not be explicitly illegal for a computer to do something, but you are still liable for what your shit does. You can't just build a robot lawnmower and run over a neighbor's kid. And if you're steering your lawnmower with random numbers... yeah.

But because it's OpenAI, with its 300 billion dollar "valuation", absolutely nothing can happen whatsoever.