this post was submitted on 22 Mar 2025
160 points (97.1% liked)

Technology


“The rise of AI agents like Operator shows the dual nature of technology — tools built for productivity can be weaponized by determined attackers with minimal effort. This research highlights how AI systems can be manipulated through simple prompt engineering to bypass ethical guardrails and execute complex attack chains that gather intelligence, create malicious code, and deliver convincing social engineering lures.”

top 7 comments
[–] BoulevardBlvd@lemmy.blahaj.zone 58 points 2 days ago (2 children)

I mean, I get the idea that this technology makes it easier to do this sort of thing at scale, but there are already entire sophisticated overseas corporations dedicated solely to phishing at scale using ultra-cheap labor, so... is anything really going to change, other than those scam companies going out of business due to grassroots competition? I don't really see how this changes anything other than the labor budget of phishing operations.

[–] echodot@feddit.uk 6 points 1 day ago

These sorts of scams always go after the technologically illiterate and the elderly. They don't need AI to make them more sophisticated; in fact, making them more sophisticated might be counterproductive, because the scammers would waste time stringing people along for longer, only for the targets to get suspicious once the scam becomes obvious. And at some point it will become obvious, usually the point at which they try to get you to mail them cash.

So it's better for them to just be as blatant as possible to weed out the people who will never send them any money but might get strung along for a little while.

[–] theshoeshiner@lemmy.world 16 points 1 day ago* (last edited 1 day ago) (1 children)

When something becomes substantially easier to do, its prevalence increases substantially as well. So it's not that phishing is going to get any more complex or deceptive. It's that it's going to come from 10x+ more endpoints. And while you personally may feel immune, it's all a numbers game to the scammer: the more attacks they send out, the more of them land, even if the rate per attempt stays the same.

If you're already getting 10 scam calls and texts a day, imagine getting 100. If you're getting 100, imagine getting 1000.
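To put the numbers-game argument in concrete terms, here is a minimal back-of-the-envelope sketch; the per-lure success rate and daily volumes below are made-up assumptions for illustration, not figures from the article:

```python
# Expected-value arithmetic behind the "numbers game" point.
# All figures here are illustrative assumptions, not data from the article.

success_rate = 0.001  # assumed chance that any single lure lands a victim

volumes = {
    "human-run operation": 1_000,     # hypothetical lures sent per day
    "AI-assisted operation": 10_000,  # hypothetical 10x volume at the same rate
}

for label, lures_per_day in volumes.items():
    expected_victims = lures_per_day * success_rate
    print(f"{label}: {lures_per_day:,} lures/day -> ~{expected_victims:.0f} expected victims/day")
```

At a fixed per-lure success rate, 10x the volume means roughly 10x the expected victims, which is the point about more endpoints.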

[–] OpenStars@piefed.social 6 points 1 day ago

I remember when this happened in the USA, it was the telephone company's responsibility to shut it down when lots of people were getting spam from a particular source at once.

Then that was changed, sometime after Obama and before Biden.

[–] Tea@programming.dev 12 points 2 days ago (1 children)
[–] Flagstaff@programming.dev 1 points 1 day ago (1 children)

So... it's becoming truer, then?

[–] mojofrododojo@lemmy.world 1 points 21 hours ago

becoming truthier