this post was submitted on 03 Sep 2023
182 points (88.6% liked)

Technology

[–] skymtf@lemmy.blahaj.zone 103 points 2 years ago (35 children)

I feel like the NTSB needs to draft a minimum spec for self-driving cars, plus a testing course that involves some of the worst circumstances, before approval. I feel like all self-driving cars should be required to have lidar and other sensors. Computer vision alone really isn't working out.
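To make the idea concrete, a "min spec" could be expressed as a machine-checkable requirement on the sensor suite. This is a purely hypothetical sketch: the `SensorSuite` fields, the `MIN_SPEC` thresholds, and the check itself are invented for illustration, since no such NTSB spec exists.

```python
# Hypothetical sketch of a regulator-style minimum sensor spec expressed as
# data, with a simple compliance check. All names and thresholds are invented.
from dataclasses import dataclass


@dataclass
class SensorSuite:
    has_lidar: bool
    has_radar: bool
    camera_count: int


# Invented example thresholds, not a real standard.
MIN_SPEC = {"lidar": True, "radar": True, "min_cameras": 4}


def meets_min_spec(suite: SensorSuite) -> bool:
    """Return True if the vehicle satisfies the (hypothetical) minimum spec."""
    return (
        (suite.has_lidar or not MIN_SPEC["lidar"])
        and (suite.has_radar or not MIN_SPEC["radar"])
        and suite.camera_count >= MIN_SPEC["min_cameras"]
    )


# A camera-only design fails this check regardless of how many cameras it has:
print(meets_min_spec(SensorSuite(has_lidar=False, has_radar=False, camera_count=8)))  # False
print(meets_min_spec(SensorSuite(has_lidar=True, has_radar=True, camera_count=6)))    # True
```

The point of encoding it as data is that the spec can be versioned and tightened over time without rewriting the approval tooling.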

[–] SuperSleuth@lemm.ee 12 points 2 years ago (6 children)

Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

[–] optissima@lemmy.world 28 points 2 years ago (1 children)
[–] stopthatgirl7@kbin.social 8 points 2 years ago

Yes, because when there’s an accident with a person driving, you usually know exactly who is legally to blame. With self-driving, if the car hits and kills someone, who do you charge? There’s no single person you can point to as responsible when something goes wrong, the way you can with a human driver.

[–] FoxBJK@midwest.social 15 points 2 years ago (1 children)

Human drivers should be facing more rigorous testing regardless. It’s horrifically easy to get a license… and then they never test you again for the rest of your life. That’s just insane when you think about it. My test was in 2002. Feels like I should have to retake it at some point.

[–] TenderfootGungi@lemmy.world 4 points 2 years ago

And take licenses away for bad driving. But we don’t, because our entire transportation infrastructure, outside of a few cities (namely NY), is built around everyone driving a car.

[–] IphtashuFitz@lemmy.world 13 points 2 years ago (1 children)

Yes. A human brain can handle edge cases it’s never encountered before. Can a self driving car?

  • Ever stop at a red light only to have a police officer wave you through?

  • Ever encounter a car driving the wrong way down a one way street?

  • Ever come across a flooded out stretch of road? (if the road has no lines and the water is still it can be very deceptive looking)

These are a tiny number of things I’ve encountered over the past few years. I’m sure plenty of other drivers can provide other good examples. I’d want to know how a self driving car would handle itself in situations like these.
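Edge cases like these could be encoded as named approval-test scenarios that a self-driving system must pass before certification. The sketch below is hypothetical: the scenario names, the `drive()` stub, and the "safe action" labels are all invented stand-ins for what would really be a simulation or closed-course run.

```python
# Hypothetical sketch: edge cases from the comment above encoded as an
# approval-test suite. `drive()` is a stub standing in for the real AV stack.
SCENARIOS = [
    "officer_waves_through_red_light",
    "wrong_way_driver_on_one_way_street",
    "flooded_road_no_lane_markings",
]


def drive(scenario: str) -> str:
    """Stub for the system under test; returns the action the car chose."""
    # Placeholder policy table -- a real stack would be exercised in simulation.
    safe_actions = {
        "officer_waves_through_red_light": "obey_officer",
        "wrong_way_driver_on_one_way_street": "pull_over",
        "flooded_road_no_lane_markings": "stop_and_reroute",
    }
    return safe_actions[scenario]


def run_approval_suite() -> bool:
    """Pass only if every scenario produces something other than blindly proceeding."""
    return all(drive(s) != "proceed_normally" for s in SCENARIOS)


print(run_approval_suite())  # True for this stub policy
```

The value of a fixed, named scenario list is that every vendor gets tested against the same worst cases, and the list can grow as new incidents are reported.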

[–] TopShelfVanilla@sh.itjust.works 0 points 2 years ago

How will the bot car handle itself out in the country? Dirt roads? Deer? Roadblock checkpoints full of bored, mean-spirited cops?

[–] snooggums@kbin.social 5 points 2 years ago

Yes, because each person must learn on their own and has limited experience relative to the general public as a whole.

Self-driving cars can 'learn' from all self-driving cars, and they don't get tired, forget, or anything like that. While they shouldn't be held to perfection, they should absolutely be held to a higher standard than a human.

[–] NeoNachtwaechter@lemmy.world 4 points 2 years ago

> Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

First: none of these automated cars would pass a German driver's license test. By far.

Second: of course you cannot compare tests for humans with tests for machines.

[–] nxfsi@lemmy.world -1 points 2 years ago (3 children)

Only Tesla self driving cars need to have more rigorous tests. Other brands are fine as it is because they have lidar.

[–] IphtashuFitz@lemmy.world 6 points 2 years ago (1 children)

LiDAR isn’t some sort of magic eye. The self driving system is only as good as the software that takes the inputs from cameras, LiDAR, etc., processes them, and ensures safe operation of the car.
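That point can be illustrated with a toy fusion rule: whether the car brakes for an obstacle depends on how the software combines the modalities, not on the presence of LiDAR alone. Everything below is invented for illustration; real fusion pipelines are far more involved.

```python
# Illustrative sketch: the fusion software, not the sensor, decides whether an
# obstacle is real. The thresholds and the AND-policy are invented examples.
from dataclasses import dataclass


@dataclass
class Detection:
    camera_confidence: float  # 0.0-1.0 score from the vision model
    lidar_points: int         # LiDAR returns inside the candidate bounding box


def obstacle_confirmed(d: Detection) -> bool:
    """One possible policy: require agreement between modalities before braking.

    Note the failure mode this choice creates: bad fusion logic can veto good
    LiDAR data just as easily as a camera-only system can hallucinate.
    """
    camera_says_yes = d.camera_confidence >= 0.6
    lidar_says_yes = d.lidar_points >= 20
    return camera_says_yes and lidar_says_yes


print(obstacle_confirmed(Detection(camera_confidence=0.9, lidar_points=150)))  # True
print(obstacle_confirmed(Detection(camera_confidence=0.9, lidar_points=2)))    # False
```

Swapping the `and` for an `or`, or weighting the two scores, changes the car's behavior completely with identical hardware, which is exactly the comment's point.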

[–] nxfsi@lemmy.world 1 points 2 years ago

Finally someone who actually uses critical thinking instead of being an anti-Elon bandwagoner.

[–] skymtf@lemmy.blahaj.zone 2 points 2 years ago

I feel like all of them do. Have you seen wayze nearly getting Black people killed because it didn't stop for a cop? And it can't recognize construction zones.

[–] sky@codesink.io -1 points 2 years ago

Five LiDAR sensors haven't stopped Cruise from running into a bus, multiple cars, and a fire truck. Maybe self-driving is a myth?

Maybe we should just build buses and trains and pay people good salaries to operate them??
