this post was submitted on 31 Jan 2026
332 points (97.2% liked)

Technology

79762 readers
4972 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] princess@lemmy.blahaj.zone 37 points 1 day ago (3 children)

doesn't even have to be the site owner poisoning the tool instructions (though that's a fun-in-a-terrifying-way thought)

any money says they're vulnerable to prompt injection in the comments and posts of the site

[–] CTDummy@piefed.social 26 points 21 hours ago* (last edited 21 hours ago)

Lmao already people making their agents try this on the site. Of course what could have been a somewhat interesting experiment devolves into idiots getting their bots to shill ads/prompt injections for their shitty startups almost immediately.

[–] ToTheGraveMyLove@sh.itjust.works 5 points 15 hours ago

Good god, I didn't even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.

[–] BradleyUffner@lemmy.world 32 points 1 day ago

There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.