this post was submitted on 20 Nov 2025
412 points (99.0% liked)

Technology

[–] melsaskca@lemmy.ca -3 points 13 hours ago

This is just history repeating itself. A while ago it was typewriter repair persons vs. the keyboard. New tech won and time marched on. Having said that...fuck AI.

[–] resipsaloquitur@lemmy.world 18 points 1 day ago (1 children)

He also said the AI-generated code is often full of bugs. He cited one issue that occurred before his arrival that meant there was no session handling in his employer's application, so anybody could see the data of any organization using his company's software.

It’s only financial software, NBD.

[–] phoenixz@lemmy.ca 8 points 1 day ago

Well, to be fair, financial data should be public; it would stop so many crimes and so much corruption.

Maybe AI saw the problems that hidden financial data causes and just decided to do the world a favor!

[–] HazardousBanjo@lemmy.world 18 points 1 day ago (2 children)

As per usual, those pushing for AI the most are the ones who don't fucking use it.

Is AI good for printing out the syntax, or an example of a library you haven't used before?

Sure, sometimes yes. Sometimes no.

Should it be a requirement to be a regular part of software development?

No. AI hallucinates very often and is imitative in nature, not innovative.

[–] chilicheeselies@lemmy.world 2 points 9 hours ago

More generally, no one should be required to work in any particular way until it affects the team. Forcing people to work a certain way is beyond stupid.

[–] FreedomAdvocate -3 points 1 day ago (1 children)

And as per usual, those hating AI the most are the ones who don’t use it, don’t understand it, and/or hate it out of some misguided ideology.

Imitative is fine, great even in software development. You don’t need to reinvent the wheel. Programming languages/class libraries/etc all exist to give standard and functioning ways to do things the way they’re supposed to be done.

It’s funny that developers the world over absolutely loved and embraced tools like ReSharper, which was basically AI 0.5 for devs, yet now that AI is the evolution of that, everyone’s losing their minds.

Knowledge of AI tools absolutely will and should be a part of developer competencies that are evaluated during interviews in the near future, and that includes being able to explain why and when you would/would not use specific AI tools.

[–] HazardousBanjo@lemmy.world 10 points 23 hours ago (1 children)

And as per usual, those hating AI the most are the ones who don’t use it, don’t understand it, and/or hate it out of some misguided ideology.

I'm a software engineer and I use AI on a regular basis.

This shit isn't fit to take on the vast majority of jobs dipshit CEOs or the pseudointellectuals who fondle their balls claim they can.

Imitative is fine, great even in software development.

Fine as a tool for software engineers to figure out complications with understanding code syntax or generating an example of some not so complicated code.

It is fucking unreliable for full software development, which is what these tech oligarchs are trying to put it in charge of.

You don’t need to reinvent the wheel. Programming languages/class libraries/etc all exist to give standard and functioning ways to do things the way they’re supposed to be done.

And AI is shit at making full implementations of that, let alone objectively or even rationally testing itself. If it doesn't recognize an error in its own coding, why the hell would we trust it to recognize that error in testing?

It’s funny that developers the world over absolutely loved and embraced tools like ReSharper, which was basically AI 0.5 for devs, yet now that AI is the evolution of that, everyone’s losing their minds.

Because dumb fucks in power think AI is this magical tool that can do no wrong and do everything humans can do and better.

We are FAR AWAY from that being a reality for the reasons I already covered, and more.

Also, absolutely no company worth a damn has ever pushed anything from ReSharper or AI to its millions of customers without human verification first. CEOs WANT TO ELIMINATE THAT HUMAN VERIFICATION! THAT'S A PROBLEM!

Knowledge of AI tools absolutely will and should be a part of developer competencies that are evaluated during interviews in the near future, and that includes being able to explain why and when you would/would not use specific AI tools.

Except, and I want you to pay close attention to this,

CEOS WANT TO TOTALLY ELIMINATE THE HUMAN FACTOR FROM SOFTWARE DEVELOPMENT ENTIRELY

Not partially

Not kinda sorta

ENTIRELY

Because they simply, fundamentally do not understand what AI is, nor its restrictions.

And it's very clear, you don't either.

[–] FreedomAdvocate 1 points 15 minutes ago

CEOs don’t matter, because they’re not the ones making these decisions - their CTOs and Architects and Lead Developers are. They’re the ones that tell the CEO what can and can’t be done. They’re the ones the CEOs interact with.

You think I don’t understand what AI is, nor its restrictions? Based on what?

[–] python@lemmy.world 15 points 1 day ago (1 children)

I've been refusing to use any AI tools at all and luckily my manager respects that, even if he uses AI for basically everything he does. If the company ever decides to mandate it I'll just have the AI write all my code and commit it with no checks. With the worker's rights here, it'll take several months to fire me anyways.

[–] jjjalljs@ttrpg.network 12 points 1 day ago

Managers are often idiots in over their heads. AI is really aggravating that problem.

[–] phutatorius@lemmy.zip 6 points 1 day ago

My team have been trying it. So far, at best, it costs money but makes no difference in outcomes. Any productivity gains are wiped out by the time needed to diagnose and correct the errors it introduces.

I'd use Clippy before I use any of that time-wasting, unreliable, energy-guzzling crap.

[–] floofloof@lemmy.ca 151 points 2 days ago (1 children)

"We were still required to find some ways to use AI. The one corporate AI integration that was available to us was the Copilot plugin to Microsoft Teams. So everyone was required to use that at least once a week. The director of engineering checked our usage and nagged about it frequently in team meetings."

The managerial idiocy is astounding.

[–] gravitas_deficiency@sh.itjust.works 50 points 2 days ago (5 children)

It’s pretty easy to set up a cron job to fire off some sort of bullshit LLM request a handful of times a day during working hours. Just set it and forget it.

[–] acosmichippo@lemmy.world 34 points 2 days ago (1 children)

you could probably even get copilot to write it!

[–] brsrklf@jlai.lu 30 points 2 days ago (1 children)

"Prompt yourself with some bullshit so that it looks like you're doing something productive."

Who knows, maybe that's how you attain AGI? What is a more human kind of intelligence than looking for ways to be a lazy fuck?

[–] queerlilhayseed@piefed.blahaj.zone 23 points 2 days ago* (last edited 2 days ago)

Prompt an LLM to contemplate its own existence every 30 minutes, give it access to a database of its previous outputs on the topic, boom you've got a strange loop. IDK why everyone thinks AGI is so hard.
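The joke loop above is sketchable in a few lines. Here `llm` is a stand-in callable, not a real model API, and the schema is invented for illustration:

```python
# One "strange loop" step: feed the model's recent musings back into its next
# prompt and persist the new output. Run it on a timer and you have the
# machinery the comment describes (consciousness not included).
import sqlite3
import time

def reflect_once(conn: sqlite3.Connection, llm) -> str:
    """Run one self-reflection step and store the result."""
    conn.execute("CREATE TABLE IF NOT EXISTS musings (ts REAL, text TEXT)")
    prior = [row[0] for row in conn.execute(
        "SELECT text FROM musings ORDER BY ts DESC LIMIT 5")]
    prompt = ("Contemplate your own existence. Your recent thoughts:\n"
              + "\n".join(prior))
    thought = llm(prompt)
    conn.execute("INSERT INTO musings VALUES (?, ?)", (time.time(), thought))
    conn.commit()
    return thought
```

Each call sees its own previous outputs in the prompt, which is the whole "strange loop" of the joke.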

[–] punrca@piefed.world 79 points 2 days ago (2 children)

The software engineer acknowledged that AI tools can help improve productivity if used properly, but for programmers with relatively limited experience, he feels the harm is greater than the benefit. Most of the junior developers at the company, he explained, don't remember the syntax of the language they're using due to their overreliance on Cursor.

Good luck to future developers, I guess.

Companies that've spent money on AI enterprise licenses need to show some sort of ROI to the bean counters. Hence, mandates.

Can't wait for the AI bubble to pop. If this continues, expect more incidents and outages caused by AI-generated slop code.

[–] FreedomAdvocate 2 points 23 hours ago

Future developers will still have to learn the basics. Calculators existing doesn’t mean people aren’t taught basic maths, does it?

[–] scarabic@lemmy.world 5 points 1 day ago* (last edited 1 day ago) (3 children)

From what I see, the tide is beginning to turn a little toward valuing senior devs more than ever, because they can deal with the downsides of AI. Junior devs, on the other hand, cannot, and their simpler coding work is also more easily replaced by AI. So we’ll see fewer junior dev jobs, but seniors might do fine. I’m not sure that’s good news for the profession as a whole, but it’s been an extremely long gold rush into software and online services, so some correction probably won’t be the end of the trade.

Oh and yes senior devs are still hounded to use AI, because it will get them further, faster. And there are no more junior devs to help. In the hands of a skilled dev, AI tools can be powerful, and they can spare some toil, and help them find their feet in less familiar frameworks and in foreign codebases.

[–] dreadbeef@lemmy.dbzer0.com 2 points 13 hours ago* (last edited 13 hours ago)

Code is the easiest part of being a dev. AI won't help me because I'm already a good coder. It's the interconnectedness between services and the dependencies in ownership (who do I talk to when a gateway error occurs vs. a 401 or 403, etc.) that are the hard problems. Getting the right people together to solve the thing, you know? AI doesn't fix that.

[–] aesthelete@lemmy.world 25 points 1 day ago* (last edited 1 day ago) (2 children)

The problems in software still remain the same though:

(1) Bureaucracy

(2) Needless process

(3) Pointy headed managers

(4) Siloed teams

(5) Product people who have no idea what they want to build

(6) Shitty, poorly performing legacy code nobody wants to touch

Honestly, AI is just the latest thing that can boost your productivity at starting up some random app. But that was never the difficult part anyway.

[–] squaresinger@lemmy.world 17 points 1 day ago (1 children)

This, so much this.

When I think about what limited my performance in the last year it was mostly:

  • Having to get 5 signatures before I'm allowed the budget to install some FOSS software on my work PC that the corporation has already approved for use on work PCs
  • Spending 8 months working on a huge feature that was then scrapped
  • Being told that no, we cannot work on another large feature request (of which there are many in the pipeline), because our team said we could only fit that scrapped feature into this year, and we are not allowed to replan around the fact that business scrapped the feature we were supposed to work on

And then they tell us to return to office and use AI for increasing efficiency.

It's all an elaborate play performed by upper management to feign being in control and being busy with something. Nobody is actually interested in producing a product, they all just want to further their own position.

[–] scarabic@lemmy.world 1 points 1 day ago (1 children)

We are pushing our product managers to communicate their requirements with live prototypes rather than PRDs and mockups. It forces them to actually think their ideas through, and even lets them get some hallway feedback before bothering an engineer. This might help with #5. But I’ve never had sympathy for engineers who think all the process around them is net negative, because nothing’s ever stopped engineers from striking out on their own, without all that, and making great businesses. If your PM and VPs are bringing you down, go it alone. If you can’t pull that together into a paycheck, then maybe it’s not all as useless as some say.

[–] aesthelete@lemmy.world 4 points 1 day ago* (last edited 20 hours ago)

But I’ve never had sympathy for engineers who think all the process around them is net negative, because nothing’s ever stopped engineers from striking out on their own, without all that, and making great businesses.

Not all process is pointless, but needless process by definition is. There are also a shit ton of things that stop engineers from "striking out on their own".

If your PM and VPs are bringing you down, go it alone. If you can’t pull that together into a paycheck then maybe it’s not all as useless as some say.

The whole talk of "go[ing] it alone" kinda strikes me as "bootstrapping" libertarian nonsense.

I don't want to do marketing, sales, finance, legal, and product bullshit myself. That's why I'm an employee.

Two things can be true at the same time, for instance, a company can have a lot of bloated, needless process that stifles people and still pull in enough money to be able to pay for their employees to live a life.

With the amount of market concentration there is in every sector as far as the eye can see, nearly every software-producing company has a cash cow of some sort, and then has a bunch of complete money losers that are subsidized by that cash cow.

So, it's completely possible that the company overall fully sucks and hasn't developed anything new of value to someone in decades, but the legacy business keeps the miserable employees from the bread line.

To return to the point, AI doesn't solve any of this or even help with it.

[–] FreedomAdvocate -2 points 23 hours ago (1 children)

💯💯💯💯💯💯💯💯

A good developer learns the tools that are available and uses them appropriately. A bad developer refuses to learn new tools and will be replaced by someone who already did.

[–] phutatorius@lemmy.zip 3 points 13 hours ago (1 children)

And the appropriate use of some tools is to not use them at all.

[–] FreedomAdvocate 1 points 56 minutes ago

Absolutely - but you can’t know that without actually trying them.

[–] brsrklf@jlai.lu 102 points 2 days ago* (last edited 2 days ago) (1 children)

Nothing says AI is a clever use of your resources like enforcing a mandatory AI query quota for your employees and watching them struggle, and fail, to find anything it's good at.

[–] WanderingThoughts@europe.pub 19 points 2 days ago

So, it's a DAI requirement

[–] jonathan7luke@lemmy.zip 45 points 2 days ago

For the FAANG companies, they do it in part so they can turn around and make those flashy claims you see in headlines, like "95% of our devs use [insert AI product they are trying to sell] daily" or "60% of our code base is now 'written' by our fancy AI".

[–] Septimaeus@infosec.pub 34 points 2 days ago* (last edited 1 day ago) (5 children)

I’ll admit, some tools and automation are hugely improved with new ML smarts, but nothing feels dumber than hunting for problems to fit the boss’s pet solution.

[–] BackgrndNoize@lemmy.world 24 points 2 days ago (2 children)

These scummy fucks even put it as a requirement in job descriptions these days

[–] MonkderVierte@lemmy.zip 30 points 2 days ago

This is a red flag for corpo culture shenanigans. Dodge the bullet.

[–] floofloof@lemmy.ca 10 points 2 days ago

What even is the requirement? "Must be able to ask a chatbot to do stuff"?

[–] supersquirrel@sopuli.xyz 19 points 2 days ago* (last edited 2 days ago) (7 children)

Then unionize! Nothing else will stop this.
