this post was submitted on 01 Aug 2025
1094 points (99.0% liked)

A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

(page 3) 50 comments
[–] interdimensionalmeme@lemmy.ml 12 points 1 day ago* (last edited 1 day ago) (1 children)

"Today’s verdict is wrong"
I think a certain corporation needs to be reminded to show some humility toward the courts.
Corporations should not expect the mercy to get away with saying things a human never would.

[–] drmoose@lemmy.world 8 points 1 day ago* (last edited 1 day ago) (1 children)

Seems like jury verdicts don't set legal precedent in the US, but they are still often considered to have a persuasive impact on future cases.

This kinda makes sense, but the articles on this don't make it very clear how impactful it actually is - crossing my fingers for Tesla's downfall here. I'd imagine launching robotaxis would be even harder now.

It's funny how this legal bottleneck was the first thing AI driving research ran into. Then we kinda collectively forgot about it, and now it seems like it actually was as important as we thought it would be. Let's say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

[–] bluGill@fedia.io 7 points 1 day ago (1 children)

What jury verdicts do is cost real money - companies often (not always) change in hopes of avoiding more.

[–] drmoose@lemmy.world 2 points 1 day ago* (last edited 1 day ago) (1 children)

Yeah, but also how would this work at full self-driving scale? If there are 1,000 cases a year and 100 are settled for 0.3 billion each, that's already 30 billion a year, almost a quarter of Tesla's yearly revenue. Then, in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable unless we go with "a human life costs X amount of money, next".
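As a rough back-of-envelope check of that figure (every input below is an assumption for illustration: the payout just mirrors this ~$300M verdict, and the revenue number is roughly Tesla's reported 2024 revenue):

```python
# Back-of-envelope estimate of annual legal exposure at robotaxi scale.
# All inputs are assumptions for illustration, not reported figures.
cases_per_year = 1_000      # hypothetical lawsuits per year at full scale
settle_rate = 0.10          # assume ~10% end in a settlement or lost verdict
payout_per_case = 0.3e9     # USD, using this ~$300M verdict as a reference
annual_revenue = 97.7e9     # USD, roughly Tesla's reported 2024 revenue

exposure = cases_per_year * settle_rate * payout_per_case
print(f"Annual exposure: ${exposure / 1e9:.0f}B, "
      f"about {exposure / annual_revenue:.0%} of revenue")
# -> Annual exposure: $30B, about 31% of revenue
```

Whether that works out to "a quarter" or closer to a third depends on which year's revenue you plug in, but the order of magnitude is the point.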

I genuinely think we'll be stuck with human drivers for a long time outside of highly controlled city rides like Waymo, where the cars are limited to around 40 km/h, which makes it very difficult to kill anyone either way.

[–] bluGill@fedia.io 2 points 1 day ago (1 children)

We already have numbers from all the deaths caused by human drivers. Once someone makes self-driving safer than humans (remember, drinking is a factor in many human-caused driving deaths, so non-drinkers will demand that be accounted for)...

[–] drmoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

No, the issue still remains: who's actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla, so even if it's 1% of total accidents, the legal institutions will be overwhelmed because the issue is 1000% harder to resolve.

Once Tesla starts losing multiple $300M lawsuits the floodgates will be open and the company is absolutely done.

[–] bluGill@fedia.io 1 points 18 hours ago (1 children)

That is an issue.

I just realized that I didn't finish the thought. Once self-driving is statistically safer, we will ban human drivers. In some places it will be by law, in some by the subtler pressure of insurance costs, in some by something else.

We need to figure out liability, of course. I have ideas, but nobody will listen, so no point in putting them in writing.

[–] NauticalNoodle@lemmy.ml 5 points 1 day ago* (last edited 1 day ago) (2 children)

I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...

[–] ChickenLadyLovesLife@lemmy.world 2 points 21 hours ago

Whoa there, pardner. Boeing has people murdered when they go against the company. Tesla only kills customers (so far, at least).

[–] partial_accumen@lemmy.world 7 points 1 day ago (5 children)

Don't take my post as a defense of Tesla; there is blame on both sides here, but I lay the huge majority of it on Tesla's marketing.

I had to find two other articles to figure out whether the system being used here was Tesla's free, included Autopilot or the more advanced paid (one-time fee or subscription) version called Full Self-Driving (FSD). The answer in this case was: Autopilot.

There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most people know it; it's really about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA (source). The cars that include them now (Tesla and some models from other brands) do so on their own, without a legal mandate.

Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that's true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn't call out that that skilled autonomy only comes with the Full Self-Driving paid upgrade or subscription.

So I DO blame Tesla, even if the driver contributed to the accident.

[–] TranscendentalEmpire@lemmy.today 9 points 1 day ago (1 children)

I feel like calling it AutoPilot is already risking liability; Full Self Driving is just audacious. There's a reason other companies with similar technology have gone with terms like "driver assistance". This has probably had lawyers at Tesla sweating bullets for years.

[–] partial_accumen@lemmy.world 3 points 1 day ago (1 children)

I feel like calling it AutoPilot is already risking liability;

From an aviation point of view, Autopilot is pretty faithful to the original aviation reference. The original aviation autopilot, introduced in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft's control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, autopilot would do exactly that. The current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed. Note that modern aviation autopilot is much more capable; on specific aircraft models it can even take off and land the airplane.

Full Self Driving is just audacious. There's a reason other companies with similar technology have gone with terms like "driver assistance". This has probably had lawyers at Tesla sweating bullets for years.

I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good. But most consumers don't share that mindset; if you call it that right now, they assume it has that functionality when they buy it today, which it doesn't. I agree with you that it was a legal liability waiting to happen.

[–] Auli@lemmy.ca -1 points 19 hours ago (1 children)

So you're comparing what is, let's say, 2020 technology to the 1915 version of autopilot, and not the kind in the 2020s that is much more advanced. Yeah, what BS.

[–] NotMyOldRedditName@lemmy.world 5 points 1 day ago* (last edited 1 day ago)

FSD wasn't even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.

In 2019 there was much less confusion on the topic.
