this post was submitted on 07 Mar 2026
847 points (97.4% liked)
Technology
you are viewing a single comment's thread
If you've ever used it you can see how easily it can happen.
At first you sandbox it and you're careful. Then after a while the sandbox becomes a bit of a pain, so you just run it as-is. Then it asks for permission a thousand times, and at first you carefully check each command, but after a while you just skim them, and eventually: sure, you can run `psql *` to debug some query on the dev instance....
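To make the fatigue loop concrete, here's a toy sketch of the kind of allowlist gate an agent harness might put in front of shell commands. Everything here (`ALLOWED_PREFIXES`, `gate`, the specific prefixes) is invented for illustration, not any real tool's API:

```python
# Toy allowlist gate for agent-proposed shell commands.
# All names and prefixes are illustrative, not a real tool's API.
import shlex

ALLOWED_PREFIXES = {
    ("git", "status"),
    ("git", "diff"),
    ("pytest",),
}

def is_preapproved(command: str) -> bool:
    """True only if the command starts with a known-safe prefix."""
    parts = tuple(shlex.split(command))
    return any(parts[: len(p)] == p for p in ALLOWED_PREFIXES)

def gate(command: str) -> bool:
    if is_preapproved(command):
        return True
    # Everything else needs an explicit human yes -- the step people
    # start skimming once the prompts number in the hundreds.
    answer = input(f"Run `{command}`? [y/N] ")
    return answer.strip().lower() == "y"
```

The point of a prefix allowlist is to shrink the prompt volume to the genuinely risky commands; once every trivial `git status` triggers a confirmation, rubber-stamping is almost guaranteed.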
It's one of the major problems with the "full self driving" stuff as well. It's right often enough that eventually you get complacent or your attention drifts elsewhere.
This kind of stuff happened before LLM coding agents existed; they have just supercharged the speed, and as a result increased the amount of damage that can be done before it's noticed.
A bunch of failures already have to be in place for something like this to happen: having the prod credentials available in the first place, and so on. It's just that now, instead of rolling the dice every couple of weeks, your LLM is rolling them every 20 seconds.
How could this happen easily? A regular developer shouldn’t even have access to production outside of exceptional circumstances (e.g. diagnosing a production issue). Certainly not as part of the normal dev process.
They shouldn't, and we know that, but this is hardly the first time that story has been told, even before LLMs. Usually it was blamed on "the intern" or whatever.
This isn’t just an issue with a developer putting too much trust into an LLM though. This is a failure at the organizational level. So many things have to be wrong for this to happen.
If an ‘intern’ can access a production database then you have some serious problems. No one should have access to that in normal operations.
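The real fix for that is credential separation at the organizational level, but even a crude tripwire in the dev tooling illustrates the idea. A toy sketch, with the host-naming convention invented for the example:

```python
# Illustrative guard: dev tooling refuses to connect if the DSN
# looks like production. The host markers are made up for this example.
from urllib.parse import urlparse

PROD_HOST_MARKERS = ("prod", "production")

def assert_not_prod(database_url: str) -> None:
    """Raise before connecting if the target host looks like prod."""
    host = urlparse(database_url).hostname or ""
    if any(marker in host for marker in PROD_HOST_MARKERS):
        raise RuntimeError(f"Refusing to run against production host: {host}")

assert_not_prod("postgres://dev-db.internal:5432/app")  # dev host passes
```

A string check like this is obviously no substitute for not having the prod credentials on the machine at all, which is the actual point being made above.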
Sure, I'm not telling you how it should be, I'm telling you how it is.
The LLM just increases the damage because it can do more, faster, before someone figures out they fucked up.
This is the last big one I remember offhand, but I know it happens a couple of times a year, and probably more just goes unreported.
https://www.cnn.com/2021/02/26/politics/solarwinds123-password-intern
Why an intern was given prod supply-chain credentials, who knows. People fuck up all the time.
Yes, I can see how it can easily happen to stupid lazy people.