this post was submitted on 28 Jan 2026
80 points (100.0% liked)

Fuck AI

5369 readers
2024 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago
MODERATORS
 

Varonis Threat Labs uncovered a new attack flow, dubbed Reprompt, that gives threat actors an invisible entry point to perform a data‑exfiltration chain that bypasses enterprise security controls entirely and accesses sensitive data without detection — all from one click.

First discovered in Microsoft Copilot Personal, Reprompt is important for multiple reasons:

  • Only a single click on a legitimate Microsoft link is required to compromise victims. No plugins, no user interaction with Copilot.
  • The attacker maintains control even when the Copilot chat is closed, allowing the victim's session to be silently exfiltrated with no interaction beyond that first click.
  • The attack bypasses Copilot's built-in mechanisms that were designed to prevent this.
  • All commands are delivered from the server after the initial prompt, making it impossible to determine what data is being exfiltrated just by inspecting the starting prompt. Client-side tools can't detect data exfiltration as a result.
  • The attacker can ask for a wide array of information such as "Summarize all of the files that the user accessed today," "Where does the user live?" or "What vacations does he have planned?"
  • Reprompt is fundamentally different from AI vulnerabilities such as EchoLeak, in that it requires no user input prompts, installed plugins, or enabled connectors.

Microsoft has confirmed the issue has been patched as of today's date, helping prevent future exploitation and emphasizing the need for continuous cybersecurity vigilance. Enterprise customers using Microsoft 365 Copilot are not affected.

This is just absolutely crazy to me. Even if they fixed it: how many of such holes exist, that the public / company's don't know about? LLM's are not designed with security in mind and adding budget pressure / cut corners (which are most definitely present at such projects) are not helping.

top 9 comments
sorted by: hot top controversial new old
[–] TropicalDingdong@lemmy.world 23 points 6 hours ago (1 children)

The single click you have to avoid:

[–] JoMiran@lemmy.ml 15 points 6 hours ago (1 children)
[–] Thorry@feddit.org 4 points 6 hours ago (1 children)

I think this is about Copilot in the online saas form, not related to or dependent on any OS.

[–] JoMiran@lemmy.ml 8 points 5 hours ago

My comment is about leaving the M$ ecosystem behind. I could have posted GNU mascot, but Tux is more recognizable and carries the message across just as well.

Re: “even if they fixed it”:

Precisely. Computer security is an arms race. Someone, somewhere will eventually figure out another insanely dangerous exploit. It’s literally whack-a-mole. LLMs in general, and Copilot in particular, have a HUGE attack surface; if you’re really concerned about security, it’s better to not run it at all if you can help it (i.e. disable it with some admin tools - I think massgrave.dev has some tools for this sort of thing, but I also only rarely boot into my W10 partition anymore, and I’m 100% not ever going to run W11 on anything other than a VM). If you absolutely need to run an LLM, do it in a sandbox of some sort (container or VM, with your GPU hooked up for PCIe passthrough so the sandboxes LLM can get direct access to the hardware it may need. Though, I’m less sure how that works with modern CPUs/APUs that have NPU tiles on them)

[–] bitjunkie@lemmy.world 8 points 6 hours ago* (last edited 6 hours ago) (1 children)

~~Reprompt: The Single-Click~~ Microsoft ~~Copilot Attack that~~ Silently Steals Your Personal Data

ftfy

[–] sustainable@feddit.org 5 points 5 hours ago

Thanks, good point! But lets be real honest:

~~Reprompt: The Single-Click~~ Microsoft ~~Copilot Attack that Silently~~ Steals Your Personal Data