"slams"...
For fucks sake... sigh...
Welcome! This is a community for all those who are interested in protecting their privacy.
PS: Don't be a smartass and try to game the system, we'll know if you're breaking the rules when we see it!
Some of these are only vaguely related, but great communities.
"slams"...
For fucks sake... sigh...
Look, they're very unhappy that they need to keep the evidence of all their piracy. You would be too. It's terrible for their case!
I'm sick of this too.
sLaMs!!!!!!111
I think the only reason OpenAI would delete their logs is to destroy evidence. I don't trust for a second they care about anyone's privacy.
Exactly. There's a reason why user-accessible file structures and categories in things like email, OS functions, and even cloud databases prominently feature search bars: there's a lot of data to be had from search queries. And the full-on conversations people hold with AI systems are rife with exactly that kind of data.
It's naive to think any company actually deletes data, unless deletion is literally their business, or they're destroying evidence. Even then, there are probably multiple copies around. The data is simply too valuable. They just flag it as inactive, but I'd bet it's still there for data mining or training.
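An illustrative sketch of the "flag it as inactive" pattern described above, in Python/SQLite (the table and column names are made up for illustration, not anything any vendor has confirmed):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE chat_logs (
        id INTEGER PRIMARY KEY,
        user_id INTEGER,
        content TEXT,
        deleted INTEGER DEFAULT 0  -- the "inactive" flag; the row itself never goes away
    )
""")
conn.execute("INSERT INTO chat_logs (user_id, content) VALUES (1, 'hello')")

# What the user experiences as "delete" is often just flipping the flag.
conn.execute("UPDATE chat_logs SET deleted = 1 WHERE user_id = 1")

# User-facing queries filter on the flag, so the data looks gone...
print(conn.execute("SELECT * FROM chat_logs WHERE deleted = 0").fetchall())  # []

# ...but internally it is all still there for mining or training.
print(conn.execute("SELECT * FROM chat_logs").fetchall())  # [(1, 1, 'hello', 1)]
```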
Nope! We have policies for regularly deleting data, because leftover data can turn into a legal nightmare, especially when it comes to discovery. It’s much easier to point them to the policy explaining why it isn’t there than to compile everything, hand it over, and potentially have something buried in there. The only things we keep longer are the ones we’re legally obligated to keep.
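For contrast with the soft delete above, a minimal sketch of the hard-delete retention job this commenter is describing, assuming a 90-day policy window and an invented schema:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # assumed policy window; the real number comes from records management

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, created_at TEXT, content TEXT)")
conn.execute("INSERT INTO records (created_at, content) VALUES ('2020-01-01T00:00:00+00:00', 'old log')")

def purge_expired(conn: sqlite3.Connection) -> int:
    """Hard-delete everything older than the written retention policy allows."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute("DELETE FROM records WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # rows destroyed this run, for the audit trail

# Run on a schedule (cron or similar) so deletion is routine and documented:
# "it isn't there, per policy" beats a discovery fishing expedition.
print(purge_expired(conn))  # 1
```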
Gods yes. Destroy that data, per written policy. Who the fuck wants to pay a team of lawyers $500 each to hunt through your data?
"keep longer are the ones we’re legally obligated to keep"
Bingo! Then smoke it, automatically and forever.
Yep, if the data doesn't exist because of (insert relevant policy here), you have legal defensibility for not producing it.
And if there's no legal or regulatory requirement for retaining said data... you don't.
That would be covered under "destroying evidence": it's just being destroyed before it can be determined to be evidence, which is legal if done under a retention policy.
(Identifying data and establishing which policies apply to it is part of my work; I just find it ironic that we're effectively pre-deleting evidence.)
It’s just data until it can be considered evidence. The moment we get a discovery letter, of course, we’re legally obligated to preserve the records; but until then it’s just company data and we can do with it whatever we want, including destroying it. Otherwise, everything in the world is “evidence”.
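A hedged sketch of how that discovery-letter moment changes things: a litigation-hold flag the purge job has to respect (schema invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        id INTEGER PRIMARY KEY,
        created_at TEXT,
        legal_hold INTEGER DEFAULT 0  -- flipped the moment a discovery letter arrives
    )
""")
conn.execute("INSERT INTO records (created_at) VALUES ('2020-01-01')")

def place_legal_hold(conn, record_ids):
    """Preservation duty kicks in: these rows are now evidence, exempt from purging."""
    conn.executemany("UPDATE records SET legal_hold = 1 WHERE id = ?",
                     [(rid,) for rid in record_ids])
    conn.commit()

def purge_expired(conn, cutoff):
    # The hold check is the whole point: destroying flagged records after
    # notice is spoliation, not routine retention.
    cur = conn.execute(
        "DELETE FROM records WHERE created_at < ? AND legal_hold = 0", (cutoff,))
    conn.commit()
    return cur.rowcount

place_legal_hold(conn, [1])
print(purge_expired(conn, "2021-01-01"))  # 0: the held record survives the purge
```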
I took a whole class on this in library school--"records management"--and it was kind of fun watching the more archives-focused students wrap their heads around getting rid of data/records as soon as legally allowed.
This kind of reminds me of that time Apple made a big show of resisting court efforts to get them to unlock iPhone data; they have every reason to cultivate an impression of caring about privacy, but that isn't actually evidence that they do. Handing them all this info about your life through a continual, ongoing conversation on their servers is inherently bad for your privacy in a way reassurances can't really fix.
There's a lot of reasons to prefer local AI instead and this is a big one.
If it ain't local, it ain't worth the exposure IMHO.
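On the local-AI point, for anyone who hasn't tried it: with something like Ollama (one option among several), inference stays on your own hardware, so there's no server-side log to subpoena in the first place. A minimal sketch, assuming Ollama is running locally and you've already pulled a model:

```python
import json
import urllib.request

# Ollama's local HTTP API: this request never leaves your machine.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",  # whatever model you pulled with `ollama pull`
        "prompt": "Why does local inference help privacy?",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```

The trade-off is weaker models and your own hardware bill, but there's nothing on anyone else's server to expose.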
At this point, giving these parasites information is enabling your own enslavement. The quicker people get with the agenda, the easier it will be to fight back.
You probably aren't. You're on the side of somebody who described OpenAI's point of view. You won't have an independent opinion until you've read at least one different source, one that is not owned by a huge media conglomerate, that is. Everyone seems willing to make up their mind after somebody somewhere told them something, without verification. That's how you end up a poorly informed half-wit with no notion of critical thinking.
Isn't there some problem with their Business APIs where a bunch of the training data isn't anonymized, or can be easily deanonymized? Yet that data gets sculpted, repackaged, and transmitted regularly.
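For anyone wondering how "easily deanonymized" works in practice: the classic move is a linkage attack, joining "anonymized" records with any public dataset on quasi-identifiers like ZIP code, birth year, and gender. A toy illustration (all data below is invented):

```python
# "Anonymized" logs: names stripped, quasi-identifiers left in.
logs = [
    {"zip": "02139", "birth_year": 1986, "gender": "F", "query": "chest pain at night"},
    {"zip": "90210", "birth_year": 1990, "gender": "M", "query": "divorce lawyer fees"},
]

# Any public dataset sharing those fields (voter rolls, breach dumps, ad profiles).
public = [
    {"name": "Alice Example", "zip": "02139", "birth_year": 1986, "gender": "F"},
    {"name": "Bob Example",   "zip": "90210", "birth_year": 1990, "gender": "M"},
]

def key(r):
    # The quasi-identifier tuple that does all the damage.
    return (r["zip"], r["birth_year"], r["gender"])

directory = {key(p): p["name"] for p in public}
for row in logs:
    # One dict lookup and the "anonymous" record has a name again.
    print(directory.get(key(row)), "->", row["query"])
```

Latanya Sweeney famously showed that ZIP code plus birth date plus sex uniquely identifies the large majority of the US population, and chat logs carry far richer identifiers than that.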
It's all because they thought you couldn't see "inside" the LLM's real-time "thoughts", when actually you can (so encryption is non-existent at that point, which is a security risk).