this post was submitted on 02 May 2025
178 points (98.4% liked)

memes

[–] SabinStargem@lemmy.today 10 points 2 days ago

The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is rid of MAGATs, and there's one less Tesla on the road. D, D, D.

/s.

[–] guywithoutaname@lemm.ee 20 points 2 days ago (2 children)

I'd imagine you are always responsible for what you do when you're driving, even if a system like autopilot is helping you drive.

[–] LemmyFeed@lemmy.dbzer0.com 14 points 2 days ago (1 children)

Especially because autopilot disengages right before the accident, so it's technically always your fault.

[–] arin@lemmy.world 5 points 2 days ago

Yup gotta read the fine print

[–] opus86@lemmy.today 5 points 2 days ago

If you are in the driver's seat, you are responsible for anything the car does, unless there was a provable mechanical failure.

[–] ZILtoid1991@lemmy.world 22 points 2 days ago (1 children)

Except the autopilot will modify its data to say it was turned off right at the moment it hits people...

[–] NikkiDimes@lemmy.world 18 points 2 days ago (2 children)

Nah, it just disengages a fraction of a second before impact so they can claim "it wasn't engaged at the moment of impact, so not our responsibility."

There were rumours about this for ages, but I honestly didn't fully buy it until I saw it in Mark Rober's vision vs lidar video and various other follow-ups to it.

[–] Tja@programming.dev 8 points 2 days ago

It's not about responsibility, it's about marketing. At no point do they assume responsibility, like any Level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does when on autopilot (or the so-called "full self driving"). It's just a lane-keeping assistant.

If you trust your life (or the lives of others) to a lane-keeping assistant, you deserve to go to jail, be it Tesla, VW, or BYD.

[–] xeekei@lemm.ee 4 points 2 days ago (2 children)
[–] NotMyOldRedditName@lemmy.world 10 points 2 days ago* (last edited 2 days ago) (2 children)

It turns off, but it's likely so the AEB system can kick in.

AP and AEB are separate things.

Also, all L2 crashes that involve an airbag deployment or fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.

Rules are changing to lessen when it needs to be reported, so things like fender benders won't necessarily be reported for L2 systems in the near future, but something like this still would be, and always has been.

[–] AnUnusualRelic@lemmy.world 1 points 2 days ago (1 children)

What's AEB? Automatic Energetic Braking?

[–] Kbobabob@lemmy.world 2 points 2 days ago

I'm guessing automatic emergency braking

[–] xeekei@lemm.ee 1 points 2 days ago (1 children)

Ok, but if Tesla's using that report to get out of liability, we still have a damn problem.

[–] NotMyOldRedditName@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

If it's an L2 system, the driver is always liable. The report just makes sure we know it's happening and can force changes if patterns are found. The NHTSA made Tesla improve their driver monitoring based on the data, since that was the main problem. The majority of accidents (almost all) were drunk or distracted drivers.

If it's an L4 system, Tesla is always liable; we'll see that in June in Austin, in theory, for the first time on public roads.

The report never changes liability, it just lets us know what the state of the vehicle was for the incident. Tesla can't say the system was off because it was off 1 second before, because we'll know it was on prior to that. But that doesn't change liability.

[–] Ageroth@reddthat.com 1 points 2 days ago

https://youtu.be/IQJL3htsDyQ

I'll admit I haven't actually watched the video yet but here it is

[–] randoot@lemmy.world 37 points 3 days ago (3 children)

Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame the driver. Look it up.

[–] Tja@programming.dev 4 points 2 days ago

The driver is always to blame, even if it was on. They turn it off for marketing claims.

PS: fuck elon

[–] DarrinBrunner@lemmy.world 14 points 3 days ago (1 children)

I didn't know this, but I'm not shocked, or even a little bit surprised.

[–] Sonicdemon86@lemmy.world 14 points 2 days ago (1 children)

Mark Rober had a video testing the autopilot of several cars, and he used his own Tesla. The car turned off the autopilot when he crashed through a styrofoam wall.

[–] randoot@lemmy.world 15 points 2 days ago (1 children)

This is how they claim autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those happen when autopilot was "off".

[–] anomnom@sh.itjust.works 2 points 2 days ago

It turns it off with the parking sensor 2 ft before the accident.

[–] HonoraryMancunian@lemmy.world 5 points 2 days ago* (last edited 2 days ago) (1 children)

Holy shit, I did indeed look it up, and it's true. Dunno if it'll hold up, but it's still shady as shit.

[–] JcbAzPx@lemmy.world 3 points 2 days ago (1 children)

Most states apply liability to whoever is in the driver's seat anyway. If you are operating the vehicle, even if you're not controlling it at that moment, you are expected to maintain safe operation.

That's why the Uber self driving car that killed someone was considered the test driver's fault and left Uber mostly off the hook.

Not sure how it works for the robo taxis, though.

[–] Allonzee@lemmy.world 1 points 2 days ago

Yeah that's gonna be tricky with those. I live in Vegas where they're already operating. No steering wheel at all.

[–] supersquirrel@sopuli.xyz 12 points 3 days ago* (last edited 3 days ago) (3 children)

Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian genocide, in cases like DOGE attacking the functioning of the U.S. government, in US healthcare companies' denials of claims, and in the collusion of landlord software to raise rents.

The economic function of AI is to abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that, we are in a wild amount of danger.

Just substitute for Elon the vague idea of a company that will become a legal and ethical scapegoat for brutal choices by individual humans.

[–] SkavarSharraddas@gehirneimer.de 2 points 3 days ago (1 children)

Which is why we need laws about human responsibility for decisions made by AI (or software in general).

[–] AnarchistArtificer@slrpnk.net 1 points 2 days ago

I did an internship at a bank way back, and my role involved a lot of processing of spreadsheets from different departments. I automated a heckton of that with Visual Basic, which my boss was okay with, but I was dismayed to learn that I wasn't saving anyone's time except my own, because after the internship was finished, all of the automation stuff would have to be deleted. The reason was a rule (I think a company policy rather than a law) that required any code to be in the custody of someone, for accountability purposes — "accountability" in this case meaning "if we take unmaintained code for granted, then we may find an entire department's workflow crippled at some point in the future, with no-one knowing how it's meant to work".

It's quite a different thing than what you're talking about, but in terms of the implementation, it doesn't seem too far off.

[–] AnarchistArtificer@slrpnk.net 1 points 2 days ago

It reminds me of how firing squad executions apparently used to have only some of the guns loaded with live rounds, and the rest with blanks. That way, the executioners could do some moral gymnastics to convince themselves that they hadn't just killed a person.

[–] sexy_peach@feddit.org 12 points 3 days ago

Autopilot will turn off a few milliseconds before impact either way.

[–] SaharaMaleikuhm@feddit.org 5 points 2 days ago (1 children)

In my country it's always your fault. And I'm very glad.

[–] Sauerkraut@discuss.tchncs.de 0 points 2 days ago (1 children)

Here in the US, even if the driver is found responsible, it is often only a small fine ($500) and maybe a 30-day suspension for killing someone.

[–] Reygle@lemmy.world 5 points 2 days ago (1 children)

Made by someone who's able to THINK like a Tesla owner.

Brake pedal? Unthinkable

[–] Demdaru@lemmy.world 2 points 2 days ago

Dude, that's a paid premium feature

[–] ZkhqrD5o@lemmy.world 2 points 2 days ago* (last edited 2 days ago) (17 children)

Tldr: Take the train and be safe.

Rant: In the EU, you are 35x more likely to die in a car crash than in a train crash. The union has created the so-called Vision Zero program, which aims to reach zero road deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by the arbitrarily chosen date, we focused on the real-world solution that we already have? But well, the car industry's investors would make less money, so I can answer that myself. :(

Edit: Also, Musk is a Nazi cunt who should die of cancer.

[–] Tja@programming.dev 2 points 2 days ago

Well, there is no train station at my house. Or an Aldi. Or my kids' Kindergarten. And I live in Germany, where public transport is excellent by global standards (memes about Deutsche Bahn aside).

Cars will be necessary for the foreseeable future. Let's make them as safe as possible while investing in public transport; the two are not mutually exclusive.

PS: fuck Elon.

[–] bleistift2@sopuli.xyz 2 points 2 days ago (1 children)

Speaking as a German: There are fewer train-related deaths because the trains don’t drive.

[–] ZkhqrD5o@lemmy.world 1 points 2 days ago

Well, we can thank Mr. Schröder for that. "Der Genosse der Bosse"

[–] spankmonkey@lemmy.world 2 points 3 days ago (1 children)

At best Tesla pays a fine, not Elon.

[–] koper@feddit.nl 2 points 3 days ago

Don't worry, DOGE will just fire the investigators before that happens.

[–] HurlingDurling@lemm.ee 1 points 3 days ago

Slam on the brakes?
