this post was submitted on 19 Mar 2025
1286 points (98.4% liked)

Not The Onion

15173 readers
2543 users here now


In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

[–] get_the_reference_@midwest.social 15 points 5 hours ago (1 children)

E. Lon Musk. Supah. Geenius.

[–] comfy@lemmy.ml 52 points 8 hours ago (4 children)

I hope some of you actually skimmed the article and got to the "disengaging" part.

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
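The fraction-of-a-second timing is the crux. A rough physics sketch (all speeds, timings, and deceleration figures below are illustrative assumptions, not measurements from the video or from any regulator) of how little speed a half-second braking window can shed versus an earlier commitment to stopping:

```python
# Illustrative physics sketch: how much speed can be shed if a system
# only reacts a fraction of a second before impact. All numbers are
# assumptions for illustration, not data about any real vehicle.

def speed_at_impact(v0_ms: float, t_before_impact_s: float,
                    decel_ms2: float = 9.0) -> float:
    """Speed (m/s) remaining at impact if maximum braking
    (decel_ms2, roughly 0.9 g on dry asphalt) starts t seconds early."""
    return max(0.0, v0_ms - decel_ms2 * t_before_impact_s)

v0 = 27.0  # ~100 km/h expressed in m/s

# Braking begins half a second out (about when a late disengagement happens):
late = speed_at_impact(v0, 0.5)
# Braking begins three seconds out (a system that commits to stopping early):
early = speed_at_impact(v0, 3.0)

print(f"impact speed, brake at -0.5 s: {late:.1f} m/s")
print(f"impact speed, brake at -3.0 s: {early:.1f} m/s")
```

Under these assumed numbers, a half-second window leaves the car at over 80% of its original speed, while three seconds is enough to stop entirely; whatever the reason for the disengagement, it happens too late to matter for the crash itself.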

[–] cortex7979@lemm.ee 12 points 2 hours ago

That's so wrong holy shit

[–] LemmyFeed@lemmy.dbzer0.com 11 points 7 hours ago (5 children)

Don't get me wrong, autopilot turning itself off right before a crash is sus, and I wouldn't put it past Tesla to do something like that (I mean, come on, why don't they use lidar?). But maybe it's so the car doesn't try to power the wheels or something after impact, which could potentially make the crash worse.

On the other hand, they're POS cars and the autopilot probably just shuts off cause of poor assembly, standards, and design resulting from cutting corners.

[–] FiskFisk33@startrek.website 7 points 1 hour ago (1 children)

If it can actually sense that a crash is imminent, why wouldn't it be programmed to slam on the brakes instead of just turning off?

Do they have a problem with false positives?

[–] Whelks_chance@lemmy.world 2 points 1 hour ago (1 children)

I've been wondering this for years now. Do we need intelligence in crashes, or do we just need vehicles to stop? I think you're right, it must have been slamming the brakes on at unexpected times, which I'm sure is unnerving while driving.

[–] alcoholicorn@lemmy.ml 4 points 55 minutes ago* (last edited 51 minutes ago) (1 children)

So they had an issue with the car slamming on the brakes at unexpected times, caused by misidentifying cracks in the road or glare or weird lighting or w/e. The solution was to make the cameras ignore anything they can't recognize at high speeds. This resulted in Teslas plowing into the back of firetrucks.
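The trade-off described above can be sketched as a toy confidence filter (thresholds, labels, and the split between low and high speed are all hypothetical, not Tesla's actual logic): suppressing low-confidence detections at speed does kill phantom braking, but it also drops real obstacles the classifier fails to recognize.

```python
# Toy sketch of the trade-off the comment describes: ignoring
# low-confidence detections at highway speed prevents phantom braking,
# but also drops real obstacles the classifier doesn't recognize.
# Thresholds and numbers are hypothetical, not any vendor's real logic.

def should_brake(detection_conf: float, speed_ms: float,
                 high_speed_ms: float = 20.0,
                 conf_threshold: float = 0.8) -> bool:
    """Brake only for detections above a confidence threshold when
    travelling fast; accept any detection at low speed."""
    if speed_ms < high_speed_ms:
        return detection_conf > 0.0   # cautious at parking-lot speeds
    return detection_conf >= conf_threshold

# Glare misread as an obstacle (confidence 0.3): ignored at speed,
# so no phantom braking.
print(should_brake(0.3, 30.0))   # False
# An odd-looking truck the model only scores at 0.5: also ignored,
# which is exactly the firetruck failure mode.
print(should_brake(0.5, 30.0))   # False
```

The filter can't tell a false positive from a poorly classified real object; both sit below the threshold, which is why this kind of fix trades one failure mode for another.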

As the article mentioned, other self-driving cars solved that with lidar, which Elon himself is against, because he says the AI will just get good enough and 2D cameras are cheaper.

[–] OsrsNeedsF2P@lemmy.ml 1 points 50 minutes ago (1 children)

This is from 6 years ago. I haven't heard of the issue more recently.

[–] alcoholicorn@lemmy.ml 1 points 14 minutes ago* (last edited 10 minutes ago)

https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/

The Tesla did not consistently detect that the thing in front of it was a truck, so it didn't brake. The article also describes a lot of similar cases.

I remember a youtuber doing similar tests, where they'd try to run over a fake pedestrian crossing or standing in the road, first at low speed and then at high speed. It would often stop at low speed, but very rarely stopped or swerved at high speed.

[–] Krzd@lemmy.world 4 points 3 hours ago

Wouldn't it make more sense for autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it's certain enough that there will be an accident, applying the brakes until the driver overrides would make much more sense.
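The behavior suggested here can be sketched as a tiny state machine (entirely hypothetical, not any vendor's real control loop): once a collision is judged imminent, latch full braking and hold it until the driver actively overrides, rather than silently disengaging.

```python
# Minimal sketch of "brake and hold until driver override" versus
# "just disengage". Purely illustrative; not real vehicle code.

class AssistState:
    def __init__(self) -> None:
        self.braking_latched = False

    def tick(self, collision_imminent: bool, driver_override: bool) -> str:
        """One control step: returns the commanded behavior."""
        if driver_override:
            self.braking_latched = False  # driver takes over explicitly
            return "driver_control"
        if collision_imminent:
            self.braking_latched = True   # latch; don't hand control back
        return "full_brake" if self.braking_latched else "cruise"

s = AssistState()
print(s.tick(False, False))  # cruise
print(s.tick(True, False))   # full_brake
print(s.tick(False, False))  # full_brake (still latched after the threat)
print(s.tick(False, True))   # driver_control
```

The key property is the latch: the brake command persists even after the "imminent" signal goes away, so a momentary sensor dropout can't cancel the stop; only the driver can.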

[–] skuzz@discuss.tchncs.de 6 points 6 hours ago

Normal cars do whatever is in their power to cease movement while facing upright. In a wreck, the safest state for a car is to cease moving.

[–] T156@lemmy.world 3 points 5 hours ago

Rober seems to think so, since he says in the video that it's likely disengaging because the parking sensors see the object directly in front and conclude the car is parked, which shuts off the cruise control.

[–] Tungsten5@lemm.ee 5 points 6 hours ago

I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn the features off once an impact is detected. It shutting off beforehand feels like a cheap ploy to avoid guilt.

[–] Tungsten5@lemm.ee 3 points 6 hours ago* (last edited 6 hours ago)

It always is that way; fuck the consumer, it's all about making a buck.

[–] buddascrayon@lemmy.world 41 points 12 hours ago (1 children)

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

So, who's the YouTuber that's gonna test this out, now that Elmo has pushed his way into the government to quash any investigation into it?

[–] bay400@thelemmy.club 10 points 8 hours ago

It basically already happened in the Mark Rober video; it turns off by itself less than a second before hitting.

[–] madcaesar@lemmy.world 108 points 14 hours ago (19 children)

My $500 robot vacuum has LiDAR, meanwhile these $50k pieces of shit don't 😂

[–] rbm4444@lemmy.world 17 points 9 hours ago

Holy shit, I knew I'd heard this word before. My Chinese robot vacuum cleaner has more technology than a Tesla hahahahaha

[–] pineapplelover@lemm.ee 25 points 11 hours ago (4 children)

To be fair, if you were to construct a wall and paint it exactly like the road, people would run into it as well. That being said, Tesla shouldn't rely on cameras.

[–] utopiah@lemmy.world 4 points 3 hours ago* (last edited 3 hours ago)

I'd take that bet. I imagine at least some drivers would notice something sus (depth perception should be striking as you get close, plus the lack of ANY movement, or some kind of reflection) and either

  • slow down
  • use a trick, e.g. flicking their lights or drifting slightly side to side, to try to see what's off

or probably both. But anyway, as others already said, it's being compared to other autopilot systems, not human drivers.

[–] FuglyDuck@lemmy.world 19 points 7 hours ago* (last edited 7 hours ago)

To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well.

this isn't being fair. It's being compared to the other, better autopilot systems that use both LIDAR and radar in addition to daylight and infrared optics to sense the world around them.

Teslas only use daylight and infrared. LIDAR and radar systems would not have been deceived.
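The geometric reason an active range sensor can't be fooled by a painting can be sketched in a few lines (sensor height, wall distance, and beam angles below are made-up illustrative numbers): beams aimed less steeply at a real road return ever-larger ranges, while a flat wall returns roughly one constant distance, no matter what is painted on it.

```python
# Geometric sketch of why a ranging sensor (lidar/radar) sees through a
# painted wall: a real road surface returns ranges that grow as the beam
# aims farther ahead; a vertical wall returns nearly the same range for
# every beam. All numbers are illustrative.
import math

def road_ranges(sensor_height_m: float, angles_deg: list[float]) -> list[float]:
    """Range to flat ground for downward beam angles (smaller = farther out)."""
    return [sensor_height_m / math.sin(math.radians(a)) for a in angles_deg]

def wall_ranges(wall_distance_m: float, angles_deg: list[float]) -> list[float]:
    """Range to a vertical wall at a fixed distance ahead."""
    return [wall_distance_m / math.cos(math.radians(a)) for a in angles_deg]

angles = [10.0, 5.0, 2.0]        # three beams, aimed less and less downward
road = road_ranges(1.5, angles)   # spreads out: ~8.6, ~17.2, ~43.0 m
wall = wall_ranges(20.0, angles)  # stays near 20 m for every beam
print([round(r, 1) for r in road])
print([round(w, 1) for w in wall])
```

A camera-only stack has to infer that geometry from pixels, which is exactly what the painted texture defeats; the range sensor measures it directly.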

[–] comfy@lemmy.ml 9 points 7 hours ago* (last edited 7 hours ago) (1 children)

The video does bring up human ability too with the fog test ("Optically, with my own eyes, I can no longer see there's a kid through this fog. The lidar has no issue.") But, as they show, this wall is extremely obvious to the driver.

[–] pineapplelover@lemm.ee 7 points 6 hours ago (1 children)

The Tesla would lose its shit if it saw this

[–] T156@lemmy.world 3 points 5 hours ago

They already have enough trouble with trucks carrying traffic lights, or with speed limit signs on them.
