506 points (96.7% liked) · submitted 14 Aug 2023 by L4s@lemmy.world to c/technology@lemmy.world

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[-] r00ty@kbin.life 18 points 1 year ago

I'm not so sure that disengaging Autopilot just because the driver's hands aren't on the wheel is the best option on a highway. Engage the hazard lights, remain in lane (or move to the slowest lane if possible) and come to a stop. Surely that's the better way?

Just disengaging the autopilot seems like such a copout to me. The fact that it disengaged right at the end, so that "the driver was in control at the moment of the crash", again feels like bad "self" driving, especially when the so-called self-driving software is able to come to a stop on its own in other situations.
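To make the idea concrete, here's a minimal sketch of that kind of escalation policy instead of a plain disengage. Everything in it (the state names, the VehicleControls structure, the alert threshold) is invented for illustration and is not how Tesla's software actually behaves:

```python
from dataclasses import dataclass
from enum import Enum, auto


class FallbackState(Enum):
    NORMAL = auto()
    WARNING = auto()
    CONTROLLED_STOP = auto()


@dataclass
class VehicleControls:
    """Hypothetical control interface, not any real driver-assist API."""
    hazard_lights_on: bool = False
    target_speed_mps: float = 30.0
    keep_lane: bool = True


def hands_off_fallback(state: FallbackState, ignored_alerts: int,
                       controls: VehicleControls) -> FallbackState:
    """Escalate to a controlled stop instead of silently disengaging."""
    if state is FallbackState.NORMAL and ignored_alerts > 0:
        return FallbackState.WARNING
    if state is FallbackState.WARNING and ignored_alerts >= 3:
        # Driver keeps ignoring alerts: hazards on, hold the lane, start slowing.
        controls.hazard_lights_on = True
        controls.keep_lane = True
        return FallbackState.CONTROLLED_STOP
    if state is FallbackState.CONTROLLED_STOP:
        # Ramp the target speed down to zero rather than handing control back.
        controls.target_speed_mps = max(0.0, controls.target_speed_mps - 1.0)
    return state
```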

Also, if the system cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually very bright emergency lights saturating the image it was trying to analyse), that's again a sign this shouldn't be released to the public. It's clearly just not ready.

Not taking any responsibility away from the human driver here. I just don't think the behaviour was good enough for software controlling a car used by the public.

Not to mention, of course, that the reason for suing Tesla isn't that they think Tesla is more liable. It's that Tesla is the party they can actually get some money from.

[-] I_LOVE_VEKOMA_SLC@sh.itjust.works -1 points 1 year ago

The video is very thorough and covers how the hazy footage caused by the flashing lights was one of the issues.

[-] NeoNachtwaechter@lemmy.world 5 points 1 year ago

That's not the main problem; it's more of an excuse. The main problem is explained in the video right before that:

Their radar is bad at recognizing stationary cars on the road. That means all objects, all obstacles in your path!

Emergency vehicles just happen to be the most frequent kind of obstacle you run into.

The fallback to the camera is a bad excuse anyway, because the radar is what's needed to detect obstacles first. The camera will usually pick them up later (i.e. at closer distance) than the radar.

The even better solution (trigger warning: nerdy stuff incoming) is to always mix the results of all kinds of sensors at an early stage in the processing software. That's what European car makers have done right from the beginning, but Tesla is way behind with their engineering. Their sensors still work independently, and each does its own processing. So every shortcoming of one sensor produces a faulty detection result that has to be corrected later (read: seconds later, not milliseconds) by other kinds of sensors.
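As a toy illustration of that difference (not Tesla's or any manufacturer's actual pipeline; all the structures, thresholds and confidence numbers below are invented): in the "independent" setup each sensor makes its own accept/reject decision before the results are merged, while in the fused setup the evidence is combined first and the decision comes afterwards.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class RadarReturn:
    distance_m: float
    ground_speed_mps: float   # object's speed over ground


@dataclass
class CameraDetection:
    distance_m: float         # rough estimate from image geometry
    confidence: float


@dataclass
class Obstacle:
    distance_m: float
    confidence: float


def independent_pipelines(radar: List[RadarReturn],
                          camera: List[CameraDetection]) -> List[Obstacle]:
    """Each sensor decides on its own; results are merged at the end.
    If the radar filter throws away stationary returns and the camera
    alone isn't confident enough, the obstacle never shows up."""
    from_radar = [Obstacle(r.distance_m, 0.9)
                  for r in radar if abs(r.ground_speed_mps) > 1.0]  # stationary objects dropped
    from_camera = [Obstacle(c.distance_m, c.confidence)
                   for c in camera if c.confidence > 0.8]
    return from_radar + from_camera


def fused_pipeline(radar: List[RadarReturn],
                   camera: List[CameraDetection]) -> List[Obstacle]:
    """Evidence from both sensors is combined before the accept/reject
    decision, so a stationary radar return plus a weak camera detection
    can still add up to a confident obstacle."""
    obstacles = []
    for r in radar:
        for c in camera:
            if abs(r.distance_m - c.distance_m) < 5.0:        # same rough location
                confidence = min(1.0, c.confidence + 0.4)     # radar corroboration boosts it
                if confidence > 0.8:
                    obstacles.append(
                        Obstacle((r.distance_m + c.distance_m) / 2, confidence))
    return obstacles
```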

[-] Blaidd@lemmy.world 4 points 1 year ago

Their radar is bad at recognizing stationary cars on the road. That means all objects, all obstacles in your path!

Teslas don't use radar, just cameras. That's why Teslas crash at much higher rates than real self-driving cars like Waymo.
