this post was submitted on 28 Jun 2025
114 points (100.0% liked)
TechTakes
you are viewing a single comment's thread
A computer can't be held accountable.
In theory, at least, the purpose of criminal justice is the prevention of crime. And if arresting a person would serve that purpose, then court-ordering the shutdown of a chatbot would serve the same purpose.
There's no First Amendment right to enter into criminal conspiracies to kill people. Not even if "people" is Sam Altman.
In practice, the justice system is reactive. Either the actual commission of a crime, or the recognition that one is possible, prompts the creation of laws prohibiting it and marking it as criminal; law enforcement and the justice system then investigate instances where that crime is suspected to have been committed, and litigation ensues.
Prevention may be the intent, but in reality we know this doesn't prevent crime. Outside the jurisdiction of any justice system that puts such "safeguards" in place, people will abuse that lack of jurisdiction. And people inside it with enough money or status, or both, will continue to abuse it for personal gain. Which is pretty much what's happening now, except that they've realized they can try to preempt litigation against them by buying the litigants or part of the regulatory/judicial system.
If it were a basement dweller with a chatbot that could be mistaken for a criminal co-conspirator, he would've been arrested and his computer seized as evidence, and then it would be a crapshoot whether he could even convince a jury it was an accident. Especially if he was getting paid for his chatbot. Now, I'm not saying this is right, just stating how it is for normal human beings.
It may not be explicitly illegal for a computer to do something, but you are liable for what your shit does. You can't just make a robot lawnmower and run over a neighbor's kid. If you are using random numbers to steer your lawnmower... yeah.
But because it's OpenAI, with its $300 billion "valuation", absolutely nothing can happen whatsoever.