this post was submitted on 19 Nov 2025
293 points (99.3% liked)

Technology


Around the same time, Cloudflare’s chief technology officer Dane Knecht explained in an apologetic X post that a latent bug was responsible.

“In short, a latent bug in a service underpinning our bot mitigation capability started to crash after a routine configuration change we made. That cascaded into a broad degradation to our network and other services. This was not an attack,” Knecht wrote, referring to a bug that went undetected in testing and had not previously caused a failure.
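For readers unfamiliar with the term, here is a rough sketch of how a latent bug can sit dormant until a routine configuration change trips it. This is hypothetical Python for illustration only, not Cloudflare's actual code; the `FEATURE_LIMIT` name and values are invented:

```python
# Hypothetical sketch (not Cloudflare's actual code): a "latent bug"
# passes testing because the failing input simply never occurs --
# until a routine change produces it.

FEATURE_LIMIT = 200  # hard-coded capacity assumption baked in long ago


def load_features(config_lines):
    """Parse bot-mitigation features from a config file's lines."""
    features = [line.strip() for line in config_lines if line.strip()]
    # Latent bug: this limit was never hit in testing, so the branch
    # was effectively dead code -- until a routine config change
    # crossed the threshold and the service started crashing.
    if len(features) > FEATURE_LIMIT:
        raise RuntimeError("feature count exceeds internal limit")
    return features


# Works fine for years with the usual config size...
assert len(load_features(["f%d" % i for i in range(150)])) == 150

# ...then a routine change doubles the config and every call fails.
try:
    load_features(["f%d" % i for i in range(400)])
except RuntimeError:
    pass  # in production this would be a crash, not a clean handler
```

The point is that no code changed when the outage began; only the input did, which is why the failure looked sudden despite the bug being years old.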

[–] PiraHxCx@lemmy.ml 53 points 3 days ago (4 children)

I wonder if all recent outages aren't just crappy AI coding

[–] MagicShel@lemmy.zip 109 points 2 days ago (5 children)

Shitty code has been around far longer than AI. I should know, I wrote plenty of it.

[–] floofloof@lemmy.ca 30 points 2 days ago (1 children)

They trained it on the work of people like you.

[–] MagicShel@lemmy.zip 29 points 2 days ago

Shame on them. I mark my career by how long it takes me to regret the code I write. When I was a junior, it was often just a month or two. As I seasoned, it became maybe as long as two years. Until finally I don't regret my code, only the exigencies that prevented me from writing better.

[–] wreckedcarzz@lemmy.world 15 points 2 days ago

The AI was the shitty code we wrote along the way

[–] mycodesucks@lemmy.world 7 points 2 days ago

Now... I don't like to brag...

[–] tdawg@lemmy.world 7 points 2 days ago (1 children)

I too have looked at my earliest repos in despair

[–] FauxLiving@lemmy.world 2 points 2 days ago (1 children)

It's always depressing when you ask the AI to explain your code and then you get banned from OpenAI

[–] 123@programming.dev 2 points 2 days ago (1 children)

Who didn't get hit by the fork bug the professor explicitly asked you to watch out for? Back then (with Windows systems being required to use the campus resources), it took an admin with Linux access to clean it up.

It was kind of fun walking in to the tech support area and them asking your login name with no context knowing what the issue was. Must have been a common occurrence that week of the course.
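The classic version of that bug, sketched hypothetically (not the actual assignment): `fork()` called unconditionally in a loop, so both parent and child keep iterating and the live process count doubles every pass. Rather than actually forking, the sketch below just computes the damage:

```python
# Hypothetical sketch of the accidental fork bomb:
#
#   for _ in range(n):
#       os.fork()   # BUG: parent AND child both continue the loop
#
# Each unconditional fork() doubles the number of live processes,
# so n iterations leave 2**n processes running.

def processes_after(n_iterations):
    """Live processes after n unconditional fork() calls in a loop."""
    return 2 ** n_iterations


# Ten innocent-looking loop iterations already leave 1024 processes,
# which is why an admin had to step in to kill them all.
assert processes_after(10) == 1024
```

The fix, of course, is to check `fork()`'s return value so only the parent (or only the child) continues the loop.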

[–] FauxLiving@lemmy.world 2 points 2 days ago

It was kind of fun walking in to the tech support area and them asking your login name with no context knowing what the issue was.

I see this zip bomb was owned by user icpenis, someone track that guy down.

[–] foo@feddit.uk 2 points 2 days ago

But AI can do the work of 10 of you humans, so it can write 10 times the bugs and deploy them to production 10 times faster. Especially if pesky testers stay out of the way instead of finding some of the bugs.

[–] AbidanYre@lemmy.world 22 points 3 days ago (2 children)

Humans are plenty capable of writing crappy code without needing to blame AI.

[–] KazuyaDarklight@lemmy.world 10 points 2 days ago

Absolutely, but it does feel like things have spiked a bit recently.

[–] tonytins@pawb.social 6 points 2 days ago

Train on shitty code, get shitty code. Garbage in. Garbage out.

[–] ThePantser@sh.itjust.works 15 points 3 days ago

AI coding, AI compiling, AI bug testing, AI users, etc.

Indirectly, it was. He said the outage was caused by a bug in their recent tool that lets sites block AI crawlers. It's a relatively new tool, released in the last few months, so it makes sense that it might be buggy given how urgent the rush to stop AI crawlers from DoS-ing sites has been.