this post was submitted on 26 Apr 2026
750 points (99.0% liked)

Not The Onion

[–] dan@upvote.au 7 points 2 days ago* (last edited 2 days ago) (5 children)

Honestly, I'm not sure what's worse: the current state of things (severely overworked air traffic controllers because there are huge shortages), or using AI for it.

The plane that crashed into a fire truck at LGA recently was mostly due to overworked ATC. One controller was working both ground and tower, with queues of five or six planes needing to land. He needed to continue working for half an hour after the crash too, because nobody else was around to take over.

I'm not sure I'd trust ATC to be fully automated using AI, but AI tools could probably help controllers by reducing the amount of work they have to do. For example: smarter RADAR, recommendations for which aircraft to call next, more proactive warnings when something dangerous is likely to occur, etc.
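For a toy sketch of what a "proactive warning" boils down to (purely illustrative numbers and thresholds, not any real FAA system): project each aircraft forward at its current velocity and alert if the predicted closest approach dips below a separation minimum.

```python
import math

def time_to_closest_approach(p1, v1, p2, v2):
    # Time (s) at which two constant-velocity targets are closest.
    # Positions in nautical miles, velocities in nm/s.
    dx = (p2[0] - p1[0], p2[1] - p1[1])   # relative position
    dv = (v2[0] - v1[0], v2[1] - v1[1])   # relative velocity
    dv2 = dv[0] ** 2 + dv[1] ** 2
    if dv2 == 0:
        return 0.0                        # same velocity: gap never changes
    return max(-(dx[0] * dv[0] + dx[1] * dv[1]) / dv2, 0.0)

def separation_at(p1, v1, p2, v2, t):
    # Horizontal separation (nm) after projecting both targets forward t seconds.
    return math.hypot((p2[0] + v2[0] * t) - (p1[0] + v1[0] * t),
                      (p2[1] + v2[1] * t) - (p1[1] + v1[1] * t))

# Two aircraft 10 nm apart, head-on at ~240 kt each (~0.0667 nm/s):
a, va = (0.0, 0.0), (0.0667, 0.0)
b, vb = (10.0, 0.0), (-0.0667, 0.0)
t = time_to_closest_approach(a, va, b, vb)
if separation_at(a, va, b, vb, t) < 3.0:  # 3 nm as an illustrative minimum
    print(f"ALERT: predicted loss of separation in {t:.0f} s")
```

The real conflict-alert systems are far hairier than this, but it's the kind of deterministic math that "AI-assisted ATC" would sit on top of, not a chat bot.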

[–] takeda@lemmy.dbzer0.com 3 points 1 day ago

The TV show host is talking about using fucking chat bots that we started calling AI.

[–] FlashMobOfOne@lemmy.world 19 points 2 days ago (2 children)

I’m not sure I’d trust ATC to be fully automated using AI

When generative AI can't even answer simple questions correctly, it feels like maybe it's too imprecise a tool to trust with this level of necessary precision.

And it's a shame. The US has the money to pay these people.

[–] crank0271@lemmy.world 9 points 2 days ago

The US has the money to pay these people.

Incorrect. The US only has money to start and continue wars of aggression and to terrorize (brown) civilians, and to siphon into the coffers of its oligarchs.

[–] dan@upvote.au 1 points 1 day ago* (last edited 1 day ago) (1 children)

When generative AI can't even answer simple questions correctly, it feels like maybe it's too imprecise a tool to trust with this level of necessary precision.

I don't think this would be generative AI though. Machine learning is probably a better fit - training a model based on recordings of human air traffic controllers.

[–] FlashMobOfOne@lemmy.world 1 points 1 day ago

I agree.

The problem is that AI companies are not responsible actors, even when it comes to working with the government. We saw that with the use of Claude in Iran.

They will surely claim they've acted responsibly in building whatever product they'll roll out in front of DOT, but I've seen more than enough shitty, cobbled-together products from them to believe that they'll take shortcuts and fail to do the work needed to make it function properly.

[–] sp3ctr4l@lemmy.dbzer0.com 13 points 2 days ago* (last edited 2 days ago) (1 children)

Another part of the reason that crash occurred is that the fire trucks didn't have transponders for the system that's supposed to track everything that is on, or about to be on, a particular part of the airfield.

If they had had them, the system, which is hooked into the radar detection of oncoming aircraft (and which the ATCs did have), would have started barking out warnings.

(ASDE-X is the specific system I'm talking about)

So... if you just... plug in AI... to hardware sensors that don't actually exist... well, it's gonna miss things too.
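A toy sketch of why that matters (all names made up; this is not how ASDE-X is actually implemented): the fused surface picture can only contain what some sensor reported, so a vehicle with no transponder simply never appears in it, no matter how clever the software downstream is.

```python
# Toy model: the fused surface picture is the union of what sensors report.
# Names are illustrative; the real ASDE-X fuses surface radar,
# multilateration, ADS-B, and more.
radar_tracks = {"DAL123"}          # approach radar sees arriving aircraft
transponder_tracks = {"UAL456"}    # surface targets with working transponders

def surface_picture(radar, transponders):
    return radar | transponders    # no sensor input, no track

physically_present = {"DAL123", "UAL456", "FIRETRUCK_7"}
invisible = physically_present - surface_picture(radar_tracks, transponder_tracks)
print(invisible)  # the transponder-less truck never enters the picture
```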

Kinda like how it doesn't matter how much compute power Elon crams into a Tesla: an Autopilot based on visual cameras alone will be inferior to one that also uses LIDAR.

(My source on this is YouTuber Captain Steeeve, a retired pilot who goes through the latest NTSB report)

[–] dan@upvote.au 2 points 1 day ago (1 children)

You're absolutely right!

(sorry)

[–] sp3ctr4l@lemmy.dbzer0.com 3 points 1 day ago* (last edited 1 day ago)

It's ok lol, this whole catastrophe was so complex that the NTSB... seems like it had to redo its whole report. Or maybe a better way to say it is that their initial report was incomplete, and the later report had a loooot more, and some actually different, analysis and conclusions.

There were many, many contributing factors to this.

ATC was overworked, made a mistake.

Later tried to correct it, but the correction likely wasn't heard, because on a huge truck hauling water, the diesel engines spooling up are very loud inside the cabin.

Systems... didn't specifically fail, they just didn't work correctly, because they weren't fully implemented.

The drivers of the truck could have paid attention to the red strip of lights in front of them instead of ignoring them. It's possible that if they had asked ATC 'hey, why are the DON'T GO lights on, ATC?', the controller might have looked at the system controlling those lights and seen 'oh, the lights are red because a plane will be landing in 45 seconds'.

Fustercluck.

[–] Tollana1234567@lemmy.today 1 points 1 day ago

AI is still too untrustworthy to manage that, and having 1 ATC isn't enough even with AI.

[–] kunaltyagi@programming.dev 1 points 2 days ago

The US doesn't have the money to automate most of the manual bookkeeping that's done in most ATC facilities. There's no way it'll shell out for smarter RADAR and shit.