this post was submitted on 01 Aug 2025
1163 points (99.1% liked)

Technology


A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

[–] Showroom7561@lemmy.ca 36 points 1 day ago (6 children)

Good that the car manufacturer is also being held accountable.

But...

In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

That's on him. 100%

McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake,"

Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

[–] bier@feddit.nl 8 points 14 hours ago

It is assistive technology, but that is not how Tesla has been marketing it. They even sell a product called Full Self-Driving, while it's not that at all.

[–] freddydunningkruger@lemmy.world 9 points 15 hours ago* (last edited 15 hours ago)

I dig blaming the people who wind up believing deceptive marketing practices, instead of blaming the people doing the deceiving.

Look up the dictionary definition of autopilot: a mechanical, electrical or hydraulic system used to guide a vehicle without assistance from a human being. FULL SELF DRIVING, yeah, why would that wording lead people to believe the car was, you know, fully self-driving?

Combine that with year after year of Elon Musk constantly stating in public that the car either already drives itself, or will be capable of doing so just around the corner, by the end of next year, over and over and over and

Elon lied constantly to keep the stock price up, and people have died for believing those lies.

[–] some_guy@lemmy.sdf.org 23 points 1 day ago (3 children)

Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.

[–] Showroom7561@lemmy.ca 6 points 18 hours ago (1 children)

Absolutely. I hope he and the company burn in hell, but I do not want to start giving drivers who kill people a free pass to say "well, it was the car's fault!"

"Autopilot", especially in Tesla cars, is beta software at best, and this feature should never have been allowed on public roads. In that sense, the transportation authorities that allowed it also have blood on their hands.

[–] Keelhaul@sh.itjust.works 2 points 14 hours ago

Woo, both parties are terrible, irresponsible, and should be held accountable

[–] febra@lemmy.world 17 points 1 day ago (1 children)

Well, if only Tesla hadn't invested tens of millions into marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers never read the fine print. They buy and use shit off of vibes. False marketing can and does kill.

[–] Showroom7561@lemmy.ca -2 points 18 hours ago (1 children)

I will repeat, regardless of what the (erroneous) claims are by Tesla, a driver is still responsible.

This is like those automated bill payment systems. Sure, they are automated, and the company promotes it as "easy" and "convenient", but you're still responsible if those bills don't get paid for whatever reason.

From another report:

While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Isn't using a phone while being the driver of a vehicle illegal? And what the hell was up with highway speeds near an intersection??? This dude can blame Autopilot, but goddamn, he was completely negligent. It's like there were two idiots driving the same vehicle that day.

[–] febra@lemmy.world 2 points 15 hours ago

Yes, of course the driver is at fault for being an idiot. And sadly, a shitton of drivers are idiots. Ignoring this fact is practically ignoring reality. You shouldn't be allowed to do false marketing as a company exactly because idiots will fall for it.

[–] tylerkdurdan@lemmy.world 19 points 1 day ago (1 children)

i don't disagree; but i believe the suit was over how tesla misrepresented assistive technology as fully autonomous, as the name "autopilot" implies

[–] Showroom7561@lemmy.ca 1 points 1 day ago (2 children)

Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is just engaging Autopilot.

I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

[–] limelight79@lemmy.world 6 points 1 day ago

Here's my problem with all of the automation the manufacturers are adding to cars. It's not even Autopilot-level stuff that's potentially the problem; things like adaptive cruise control come to mind.

If there's some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:

But the responsibility for safe driving is on the driver...

And how do we know there isn't some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems...ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can't fucking disable?

But the responsibility for safe driving is on the driver...

In other words, "We bear no responsibility!" So I have to pay for these "features," and the manufacturer will deny any responsibility if one of them fails and causes a crash. It's always your fault as the driver, no matter what. The companies roll this shit out to us; we no longer have the choice to buy a new car without it, and they don't even trust it enough to stand behind it.

Maybe you'll get lucky and enough issues will happen that gov't regulators will look into it (not in the US any more, of course)...but probably not. You'll be blamed, and you'll pay higher insurance, and that will be that.

So now I have to worry not only about other drivers and my own driving, but I also have to stay alert in case the car does something unexpected as well. Which has happened, when all this "smart" technology misunderstood a situation, like slamming on the brakes for a car in another lane. I've found I hate having to fight my own car.

Obviously, I very much dislike driving our newer car. It's primarily my wife's car, and I only drive it once or twice a week, fortunately.

[–] tylerkdurdan@lemmy.world -2 points 1 day ago

agree with you here. your point reminds me of this case below. the tldr is the pilots were using their laptops to look at schedules iirc and overflew their destination. it's long been speculated they were watching a movie

https://en.m.wikipedia.org/wiki/Northwest_Airlines_Flight_188