this post was submitted on 09 Jul 2023
505 points (97.0% liked)
Technology
If you're doing research, there are actually some limits on the use of the source material, and you're supposed to cite those sources.
But yeah, there's plenty of stuff where there needs to be a firm line between what a random human can do and what an automated intelligent system with potentially unlimited memory/storage and processing power can do. A human can see where I am in public. An automated system can record it permanently. An integrated AI can tell you detailed information about my daily activities, including inferences, which, even if legal, is a pretty slippery slope.
I think we need a better definition here. Is the issue really the processing power? Do we let humans get a pass because our memories are fuzzy? Your example assumes the AI system keeps massive detail, which is typically not the case: to make the data useful, it's consumed and distilled into something the system can actually work with.
This is why I'm worried about legislation and legal precedent. Most people think these AI systems read a book and store the verbatim text somewhere to reference later, when that isn't really the case. There may be fragments scattered throughout the model, and it may be able to reconstitute the text, but we don't seem to have the same issue when a human brain synthesizes data in a similar way.
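The "fragments, not verbatim text" point can be sketched with a toy statistical model. This is a minimal illustration, not how any real AI system works: the model below keeps only bigram counts (which word follows which), yet greedy generation from those fragments can still reconstitute stretches of the source text.

```python
from collections import defaultdict

# Toy model: store only scattered fragments (bigram continuations),
# never a verbatim copy of the source text.
text = "the quick brown fox jumps over the lazy dog".split()

bigrams = defaultdict(list)
for a, b in zip(text, text[1:]):
    bigrams[a].append(b)

# Greedy "generation": always take the first continuation seen.
word, out = "the", ["the"]
for _ in range(8):
    if word not in bigrams:
        break
    word = bigrams[word][0]
    out.append(word)

print(" ".join(out))
# → "the quick brown fox jumps over the quick brown"
```

The model never stored the sentence, only word-to-word counts, yet it regurgitates the original prefix exactly; this is roughly the tension the comment is pointing at.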
A continuous record of location + time, or even something like "license plate at location plus time," is scary enough to me, and that's easily data a system could hold decades of.
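A quick back-of-envelope calculation backs up the "decades of data" claim. All the figures here (record size, sightings per day) are illustrative assumptions, not measurements from any real system:

```python
# Back-of-envelope: storage for decades of license-plate reads
# on a single vehicle. All figures are illustrative assumptions.
RECORD_BYTES = 8 + 8 + 16   # plate text + 64-bit timestamp + lat/lon
READS_PER_DAY = 100          # assume ~100 camera sightings per day
YEARS = 20

total_reads = READS_PER_DAY * 365 * YEARS
total_bytes = total_reads * RECORD_BYTES

print(f"{total_reads:,} reads ≈ {total_bytes / 1e6:.0f} MB")
# → "730,000 reads ≈ 23 MB"
```

Even with generous assumptions, twenty years of one person's movements fit in tens of megabytes, so retention is effectively free for the operator.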
Is that scary because it's a machine? Someone could tail you and follow you around and manually write it all down in a notebook.
Yes, the ease of data collection is an issue, and I'm very much for better privacy rights for us all. But given the issue you've stated, I'd be more afraid of what 70-year-old politicians who don't understand any of this would write up in a bill.
They could, and then they could also be charged with stalking.
It's not just ease of collection. It's how the data is being retained, secured, and shared, among a great many other things. Laws just haven't kept up with technology, partly because, yeah, 70-year-old politicians who don't even understand email, but also because the corporations behind the technology lie and bribe to keep it that way, and face little consequence when they mishandle data or use it improperly. E.g.:
https://www.cbc.ca/news/politics/cadillac-fairview-5-million-images-1.5781735
When the government does it, we seem to have even less recourse.
Would it be stalking if you signed a legal agreement that allowed them to track you? That's the reason the California law exists. Most of us have accepted a license agreement to use an app or service, and in exchange we gave up privacy rights. And it may not even have been with the company consuming the data.
Sadly, the law requires you to contact everyone individually to demand your data be deleted. Passing a law that makes "never store my data" the default would mean most of social media goes away or goes behind a paywall. The same goes for any picture-hosting company that charges you nothing for hosting because it uses your images.
It would also most likely mean that very explicit declarations must be made before anyone can use your material, causing a lot of businesses to say it's too big a risk and drop a lot of support.
Right now we mostly work on good faith, which maybe doesn't work.