this post was submitted on 20 Apr 2024
580 points (96.9% liked)
Technology
According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything's cool?
Yes.
Yes, the judge will let the driver off the hook, because Mercedes told them it will assume the liability instead.
According to that teal light.
You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?
Sure. But no system is 100% effective, and all of their questions are legitimate and important to answer. If I got hit by one of these tomorrow, I want to know that the processes for fault, compensation, and pathway to improvement are all already settled, not something my accident is going to be the landmark case for.
But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.
I think this idea is sound, but that doesn't mean there aren't things to address around it.
Honestly I'm sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.
That still doesn't address all the issues surrounding it. I am unsure if you are just young and not aware of how these things work, or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because "it'll eventually get better" is a gateway to absurd amounts of damage. Also, not all technology always gets better. Plenty just gets abandoned.
But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn't clearly codified law, I might not, and you might be callous enough to say you don't care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you'd have to go through a long, expensive court battle to determine fault because no one had settled it yet? So you're in and out of a hospital recovering and draining all of your money on bills, both legal and medical, to eventually, hopefully, get compensated for something that wasn't your fault.
That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.
But then it's good that the manufacturer states the driver isn't obliged to watch the road. Because it shifts responsibility towards the manufacturer, and thus it's a great incentive to make the technology as safe as possible.
To be clear I never said that I didn't care about an individual's safety, you inferred that somehow from my post and quite frankly are quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve more with time.
The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies of its kind. Tesla doesn't count, as it's not an SAE Level 3 autonomous driving vehicle. There are some references in the liability section of the wiki article:
https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars
Can't the entry point just be that you have to pay attention while it's driving for you until they figure it out?
You’re deciding to prioritize economic development over human safety.
*at 40mph on a clear straight road on a sunny day in a constant stream of traffic with no unexpected happenings, Ts&Cs apply.
Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.