"Today’s verdict is wrong"
I think a certain corporation needs to be reminded to have some humility toward the courts
Corporations should not expect the mercy to get away with saying the things a human would
Seems like jury verdicts don't set legal precedent in the US, but they're still often considered to have persuasive impact on future cases.
This kinda makes sense, but the articles on this don't make it very clear how impactful it actually is - crossing my fingers here for Tesla's downfall. I'd imagine launching robotaxis would be even harder now.
It's funny how this legal bottleneck was the first thing AI driving industry research ran into. Then we kinda collectively forgot about it, and now it seems like it actually was as important as we thought it would be. Let's say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?
What jury verdicts do is cost real money - companies often (not always) change in hopes of avoiding more.
Yeah, but also how would this work at full driving scale? If there are 1,000 cases and 100 are settled for $0.3 billion each, that's already $30 billion a year, almost a quarter of Tesla's yearly revenue. Then, in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable unless we go with "a human life costs $X, next".
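The back-of-envelope math in that comment can be checked in a couple of lines; all figures here are the commenter's hypotheticals, not real data:

```python
# Sanity-check the hypothetical settlement-cost figures from the comment above.
# These numbers are the commenter's assumptions, not actual case data.
settled_cases_per_year = 100   # of a hypothetical 1,000 claims per year
settlement_usd = 0.3e9         # $0.3 billion per settled case

annual_cost_usd = settled_cases_per_year * settlement_usd
print(f"${annual_cost_usd / 1e9:.0f} billion per year")  # $30 billion per year
```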
I genuinely think we'll be stuck with human drivers for a long time outside of highly controlled city rides like Waymo, where the cars are limited to 40 km/h, which makes it very difficult to kill anyone either way.
We already have numbers for deaths caused by human drivers. Once someone makes self-driving safer than humans, that baseline will matter (remember, drinking is a factor in many human-caused deaths, so non-drinkers will demand that this be accounted for).
Don't take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.
I had to find two other articles to figure out if the system being used here was Tesla's free included AutoPilot, or the more advanced paid (one time fee/subscription) version called Full Self Drive (FSD). The answer for this case was: Autopilot.
There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for their cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it's really about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA. source The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.
Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that's true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn't call out that that level of autonomy only comes with the Full Self Drive paid upgrade or subscription.
So I DO blame Tesla, even if the driver contributed to the accident.
I feel like calling it AutoPilot is already risking liability, Full Self Driving is just audacious. There's a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.
I feel like calling it AutoPilot is already risking liability,
From an aviation point of view, Autopilot is pretty accurate to the original aviation reference. The original autopilot, released in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft's control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, autopilot would do exactly that. So the current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed too. Note: modern aviation autopilot is much more capable, in that it can even land and take off for specific aircraft models.
Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.
I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good, but most consumers don't share that mindset. If you call it that right now, they assume it has that functionality when they buy it today, which it doesn't. I agree with you that it was a legal liability waiting to happen.
FSD wasn't even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.
In 2019 there was much less confusion on the topic.
Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?
Going off of OP's quote, the jury found the driver responsible but also found Tesla liable, which is pretty confusing. It might make some sense if the expected Autopilot functionality didn't work despite the driver's foot being on the pedal.
Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?
From the article, it looks like the car didn't even try to stop, because braking was overridden by the driver having their foot pressed on the accelerator (which isn't normal during Autopilot use).
This is correct. And when you do this, the car tells you it won't brake.
I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...