this post was submitted on 18 May 2025
63 points (100.0% liked)

Video version

Podcast version if you hate pictures

[–] rook@awful.systems 24 points 21 hours ago* (last edited 19 hours ago) (3 children)

When confronted with a problem like “your search engine imagined a case and cited it”, the next step is to wonder what else it might be making up, not to just quickly slap a bit of tape over the obvious immediate problem and declare everything to be great.

The other thing to be concerned about is how lazy and credulous your legal team are that they cannot be bothered to verify anything. That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

[–] diz@awful.systems 9 points 15 hours ago* (last edited 15 hours ago) (1 children)

When confronted with a problem like “your search engine imagined a case and cited it”, the next step is to wonder what else it might be making up, not to just quickly slap a bit of tape over the obvious immediate problem and declare everything to be great.

Exactly. Even if you ensure the cited cases or articles are real, it will still misrepresent what those articles say.

Fundamentally, it is just blah-blah-blah-ing along until it reaches the point where a citation would be likely to appear, then it blah-blah-blahs the citation based on the preceding text it just made up. It plainly should not be producing real citations. That it can produce real citations at all is deeply at odds with it being able to pretend at reasoning, for example.

Ensuring the citations are real, RAG-ing the articles in there, having AI rewrite drafts: none of these hacks does anything to address the underlying problems.

[–] kbotc@lemmy.world 3 points 11 hours ago

Yeah, and if you're going to let the AI write the structure and then have a lawyer rewrite the whole thing after validating it, why not remove that step entirely and just have said lawyer write the brief and put their accreditation on the line?

[–] o7___o7@awful.systems 5 points 19 hours ago

We have got to bring back the PE exam for software engineering.

[–] BlueMonday1984@awful.systems 3 points 18 hours ago

That requires a significant improvement in professional ethics, which isn’t something that is really amenable to technological fixes.

That goes some way to explaining why programmers don't have a moral compass.