submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg — 45-minute video was meant to demonstrate v12 of Tesla’s Full Self-Driving but ended up being a list of thi...

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

[-] NotYourSocialWorker@feddit.nu 2 points 11 months ago

> If most drivers are rolling through stop signs and you're the only one stopping completely, while you might technically be in the right, your behaviour could lead to accidents due to the unpredictability.

Simply no. If you as a driver aren't prepared for the car in front of you to actually stop at a stop sign, and if you aren't keeping enough distance to be able to brake, then the car in front isn't the problem or the cause of the accident; it's you and only you.

> The same applies to speeding. Driving significantly slower than the flow of traffic might slow down the traffic flow, leading to unsafe overtakings and such.

Again no. If they are driving at the posted speed limit, holding that speed and driving predictably, then the ones driving "significantly" faster are the ones decreasing road safety. No-one is forcing them to perform "unsafe overtakings and such". Also, just because you can't see, from your vantage point, a reason for the car in front of you to be driving slowly doesn't mean that there isn't one.

While a dose of humility is good, a dose of personal responsibility is also great.

[-] Thorny_Thicket@sopuli.xyz 1 point 11 months ago

> then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtakings and such”.

I'm not claiming it is so, but I'm saying it's conceivable that if the autonomous vehicle drives slightly over the speed limit, with the flow of traffic, it may actually lead to a statistically significant drop in accidents compared to the scenario where it follows the speed limit. Yes, no one is forcing other drivers to behave in such a way, but they do, and because of that, people die. In this case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean you're choosing to kill and injure more people.

I don't think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other examples where the actually safer thing to do is not what you'd intuitively think.

this post was submitted on 30 Aug 2023
202 points (93.2% liked)
