this post was submitted on 19 Mar 2025
1505 points (98.3% liked)

Not The Onion


In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

50 comments
[–] TommySoda@lemmy.world 439 points 1 week ago (18 children)

Notice how they're mad at the video and not the car, manufacturer, or the CEO. It's a huge safety issue, yet they'd rather defend a brand that obviously doesn't even care about their safety. Like, nobody is gonna give you a medal for being loyal to a brand.

[–] can@sh.itjust.works 166 points 1 week ago (12 children)

These people haven't found any individual self-identity.

An attack on the brand is an attack on them. Reminds me of the people who made Star Wars their meaning and crumbled when a certain trilogy didn't hold up.

[–] FuglyDuck@lemmy.world 289 points 1 week ago (35 children)

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

This has been known.

They do it so they can evade liability for the crash.

[–] user224@lemmy.sdf.org 192 points 1 week ago (16 children)

I wondered how the hell it managed to fool LIDAR, well...

The stunt was meant to demonstrate the shortcomings of relying entirely on cameras — rather than the LIDAR and radar systems used by brands and autonomous vehicle makers other than Tesla.

[–] Weirdfish@lemmy.world 164 points 1 week ago

If I could pass one law, requiring multiple redundant scanning tech on anything autonomous large enough to hurt me might be it.

I occasionally go to our warehouses, which have robotic arms, autonomous forklifts, etc. All of those have far more safety features than a self-driving Tesla, and they aren't in public.

[–] thesohoriots@lemmy.world 135 points 1 week ago (12 children)

The tl;dr here is that Elon said that humans have eyes and they work, and eyes are like cameras, so use cameras instead of expensive LIDAR. Dick fully inside car door for the slam.

[–] madcaesar@lemmy.world 162 points 1 week ago (59 children)

My $500 robot vacuum has LiDAR; meanwhile, these $50k pieces of shit don't 😂

[–] comfy@lemmy.ml 142 points 1 week ago (23 children)

I hope some of you actually skimmed the article and got to the "disengaging" part.

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

[–] NikkiDimes@lemmy.world -2 points 1 week ago

I've heard that too, and I don't doubt it, but watching Mark Rober's video, it seems like he's death-gripping the wheel pretty hard before the impact, which seems more likely to be what's disengaging it. Each time, you can see the wheel tug slightly to the left, but his death grip pulls it back to the right.
