this post was submitted on 04 Jun 2025

Privacy


I can't believe I'm on OpenAI's side here

top 18 comments
[–] dissentiate@lemmy.dbzer0.com 49 points 1 day ago (3 children)

"slams"...

For fuck's sake... sigh...

[–] barnaclebutt@lemmy.world 6 points 1 day ago

Look, they're very unhappy that they need to keep the evidence of all their piracy. You would be too. It's terrible for their case!

[–] Colloidal@programming.dev 1 point 22 hours ago

I'm sick of this too.

[–] sevon@lemmy.kde.social 4 points 23 hours ago

sLaMs!!!!!!111

[–] Pogogunner@sopuli.xyz 22 points 1 day ago (1 children)

I think the only reason OpenAI would delete their logs is to destroy evidence. I don't trust for a second they care about anyone's privacy.

[–] lka1988@lemmy.dbzer0.com 7 points 1 day ago* (last edited 1 day ago)

Exactly. There's a reason why user-accessible file structures and categories in things like email, OS functions, and even cloud databases prominently feature search bars. There's a lot of data to be had from search bars. And people having full-on conversations with AI systems is rife with that kind of data.

[–] JeeBaiChow@lemmy.world 10 points 1 day ago (1 children)

Naive to think any company actually deletes data, unless it's literally their business to do so, or they're destroying evidence. Even then, there are probably multiple copies available. It's simply too valuable. They just flag it as inactive, but I'd bet it's still there for data mining or training.

[–] ramble81@lemm.ee 6 points 1 day ago (3 children)

Nope! We have a policy of regularly deleting data, since leftover data can turn into a legal nightmare, especially when it comes to discovery. It’s much easier to point them to the policy explaining why the data isn’t there than to compile it, hand it over, and risk having something damaging buried in it. The only things we keep longer are those we’re legally obligated to retain.
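The policy described above is typically automated. Here's a minimal sketch of what such a retention purge might look like, with a litigation-hold check so records under a legal preservation obligation survive; the table, column names, and 90-day window are all hypothetical:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # hypothetical policy window

def purge_expired(conn, now=None):
    """Delete records older than the retention window,
    skipping anything flagged with a litigation hold."""
    now = now or datetime.now(timezone.utc)
    cutoff = (now - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute(
        "DELETE FROM records WHERE created_at < ? AND legal_hold = 0",
        (cutoff,),
    )
    conn.commit()
    return cur.rowcount  # number of rows purged

# Demo with an in-memory database: one expired record, one under
# legal hold, one still within the retention window.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, created_at TEXT, legal_hold INTEGER)")
old = (datetime.now(timezone.utc) - timedelta(days=120)).isoformat()
new = datetime.now(timezone.utc).isoformat()
conn.executemany("INSERT INTO records VALUES (?, ?, ?)",
                 [(1, old, 0), (2, old, 1), (3, new, 0)])
print(purge_expired(conn))  # → 1 (only the expired, un-held record goes)
```

The key design point, per the comment above: deletion runs on a schedule by policy, and the `legal_hold` flag is what flips the moment a discovery letter arrives.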

[–] shalafi@lemmy.world 4 points 1 day ago (1 children)

Gods yes. Destroy that data, per written policy. Who the fuck wants to pay a team of lawyers $500 each to hunt through your data?

> keep longer are things legally obligated

Bingo! Then smoke it, automatically and forever.

[–] Onomatopoeia@lemmy.cafe 3 points 1 day ago* (last edited 1 day ago)

Yep, if the data doesn't exist because of (insert relevant policy here), you have legal defensibility for not producing it.

And if there's no legal or regulatory requirement for retaining said data... you don't.

[–] Onomatopoeia@lemmy.cafe 2 points 1 day ago (1 children)

That would be covered under "destroying evidence": it's just being destroyed before it can be determined to be evidence, which is legal if done under a retention policy.

(Identifying data and establishing which policies apply to it is part of my work; I just find it ironic that we're effectively pre-deleting evidence.)

[–] ramble81@lemm.ee 1 points 1 day ago* (last edited 1 day ago)

It’s just data until it can be considered evidence. The moment we get a discovery letter, of course we’re legally obligated to preserve the records, but until then it’s just company data and we can do with it whatever we want, including destroying it. Otherwise, everything in the world is “evidence”.

[–] grysbok 1 points 1 day ago

I took a whole class on this in library school--"records management"--and it was kind of fun watching the more archives-focused students wrap their heads around getting rid of data/records as soon as legally allowed.

[–] chicken@lemmy.dbzer0.com 9 points 1 day ago* (last edited 1 day ago) (1 children)

This kind of reminds me of that time Apple made a big show of resisting court efforts to unlock iPhone data; they have every reason to cultivate an impression of caring about privacy, but this isn't actually evidence that they do. Giving them all this info about your life, by holding an ongoing conversation on their servers about what you're doing, is inherently bad for your privacy in a way reassurances can't fix.

There are a lot of reasons to prefer local AI instead, and this is a big one.
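To make the local-AI point concrete, here's a small sketch of querying a model served on your own machine, so the prompt never leaves localhost. The endpoint shape assumes an Ollama-style server on port 11434, and the model name is a placeholder:

```python
import json
from urllib import request

# Assumed local endpoint; Ollama-style servers listen on localhost:11434.
LOCAL_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    """Build a request aimed at a locally hosted model --
    unlike a hosted API, the prompt stays on your machine."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return request.Request(
        LOCAL_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize my notes.")
print(req.full_url)  # the request targets localhost only
# urllib.request.urlopen(req) would actually run the query,
# assuming a local model server is listening.
```

Nothing here reaches a third-party server, which is the whole privacy argument: there's no remote log for a court to order preserved in the first place.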

[–] sunzu2@thebrainbin.org 4 points 1 day ago

If it ain't local, it ain't worth the exposure IMHO.

At this point, giving these parasites information is enabling your own enslavement. The quicker people get with the agenda, the easier it will be to fight back.

[–] b_tr3e@feddit.org 2 points 1 day ago* (last edited 1 day ago)

You probably aren't. You're on the side of somebody who described OpenAI's point of view. You won't have an independent opinion until you've read at least one different source. One that isn't owned by a huge media conglomerate, that is. Everyone seems willing to make up their minds after somebody somewhere told them something, without verification. That's how you end up a poorly informed half-wit with no idea of critical thinking.

[–] DarkCloud@lemmy.world 0 points 1 day ago* (last edited 1 day ago)

Isn't there some problem with their business APIs where a bunch of the training data isn't anonymized, or can be easily deanonymized? Yet that data gets sculpted, repackaged, and transmitted regularly.

It's all because they thought you couldn't see "inside" the LLM's real-time "thoughts", when actually you can (so encryption is non-existent at that point, which is a security risk).