this post was submitted on 11 Jul 2025
404 points (97.4% liked)

Technology

(page 2) 50 comments
[–] Feyd@programming.dev 48 points 1 week ago (2 children)

Fun how the article concludes that AI tools are still good anyway, actually.

This AI hype is a sickness

[–] neclimdul@lemmy.world 47 points 1 week ago (7 children)

"Explain this to me, AI." (Reads back exactly what's on the screen, including the comments, somehow with more words but less information.) Ok....

Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... fuck, now I can either toss all your changes and do it myself or spend the next 3 hours rewriting it.

Yeah I struggle to find how anyone finds this garbage useful.

[–] SpaceCowboy@lemmy.ca 19 points 1 week ago (2 children)

You shouldn't think of "AI" as intelligent and ask it to do something tricky. The boring stuff that's mostly just typing, that's what you get the LLMs to do. "Make a DTO for this table." "Interface for this JSON."

I just have a bunch of conversations going where I can paste stuff into and it will generate basic code. Then it's just connecting things up, but that's the fun part anyway.
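A minimal sketch of the kind of boilerplate being described: a DTO for a hypothetical `users` table, with field names invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class UserDTO:
    """Data-transfer object for a hypothetical `users` table."""
    id: int
    name: str
    email: str

    @classmethod
    def from_row(cls, row: dict) -> "UserDTO":
        # Map a raw DB row (column name -> value) onto the DTO fields.
        return cls(id=row["id"], name=row["name"], email=row["email"])
```

Mechanical to write, mechanical to review, which is exactly why it's a reasonable thing to hand off.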

[–] Damaskox@lemmy.world 7 points 1 week ago (4 children)

I have asked questions, had conversations for company and generated images for role playing with AI.

I've been happy with it, so far.

[–] resipsaloquitur@lemmy.world 29 points 1 week ago (2 children)

Writing code is the easiest part of my job. Why are you taking that away?

[–] desmosthenes@lemmy.world 28 points 1 week ago

no shit. AI will hallucinate shit, I'll hit tab by accident and spend time undoing that, or it'll hijack tab on new lines inconsistently

[–] astronaut_sloth@mander.xyz 25 points 1 week ago (11 children)

I study AI, and have developed plenty of software. LLMs are great for using unfamiliar libraries (with the docs open to validate), getting outlines of projects, and bouncing ideas for strategies. They aren't detail-oriented enough to write full applications or complicated scripts. In general, I like to think of an LLM as a junior developer to my senior developer. I will give it small, atomized tasks, and I'll give its output a once-over to check it with an eye to the details of implementation. It's nice to get the boilerplate out of the way quickly.

Don't get me wrong, LLMs are a huge advancement and unbelievably awesome for what they are. I think that they are one of the most important AI breakthroughs in the past five to ten years. But the AI hype train is misusing them, not understanding their capabilities and limitations, and casting their own wishes and desires onto a pile of linear algebra. Too often a tool (which is one of many) is being conflated with the one and only solution--a silver bullet--and it's not.

This leads to my biggest fear for the AI field of Computer Science: reality won't live up to the hype. When this inevitably happens, companies, CEOs, and normal people will sour on the entire field (which is already happening to some extent among workers). Even good uses of LLMs and other AI/ML use cases will be stopped, and real academic research will dry up.

[–] 5too@lemmy.world 33 points 1 week ago (2 children)

My fear for the software industry is that we'll end up replacing junior devs with AI assistance, and then in a decade or two, we'll see a lack of mid-level and senior devs, because they never had a chance to enter the industry.

[–] squaresinger@lemmy.world 16 points 1 week ago (3 children)

That's happening right now. I have a few friends who are looking for entry-level jobs and they find none.

It really sucks.

That said, the future lack of developers is a corporate problem, not a problem for developers. For us it just means that we'll earn a lot more in a few years.

[–] 5too@lemmy.world 7 points 1 week ago (2 children)

You're not wrong, and I feel like it was a developing problem even before AI - everybody wanted someone with experience, even if the technology was brand new.

That said, even if you and I will be fine, it's still bad for the industry. And even if we weren't the ones pulling up the ladder behind us, I'd still like to find a way to start throwing ropes back down for the newbies...

[–] CosmicTurtle0@lemmy.dbzer0.com 8 points 1 week ago (1 children)

They wanted someone with experience, who can hit the ground running, but didn't want to pay for it, either with cash or time.

  • cheap
  • quick
  • experience

You can only pick two.

[–] squaresinger@lemmy.world 5 points 1 week ago* (last edited 1 week ago) (2 children)

You're not wrong, and I feel like it was a developing problem even before AI - everybody wanted someone with experience, even if the technology was brand new.

True. It was a long-standing problem that entry-level jobs were mostly found in dodgy startups.

Tbh, I think the biggest issue right now isn't even AI, but the economy. In the 2010s we had pretty much no interest rate at all while having a pretty decent economy, at least for IT. The 2008 financial crisis hardly mattered for IT, and Covid was a massive boost for IT. There was nothing else to really spend money on.

IT always has more projects than manpower, so with enough money to spend, they just hired everyone.

But the sanctions against Russia in response to their invasion of Ukraine really hit the economy, and rising interest rates to combat inflation meant that suddenly nobody wanted to invest anymore.

With no investments, startups dried up and large corporations also wanted to downsize. It's no coincidence that return-to-office mandates only started after the invasion and not in the two years prior, when lockdowns had already been lifted. Work from home worked totally fine for two years after the Covid lockdowns, and companies even praised how well it worked.

Same with AI. While it can improve productivity in some edge cases, I think it's mostly a scapegoat to make mass firings sound like a great thing to investors.

That said, even if you and I will be fine, it's still bad for the industry. And even if we weren't the ones pulling up the ladder behind us, I'd still like to find a way to start throwing ropes back down for the newbies...

You are totally right with that, and any chance I get I will continue to push for hiring juniors.

But I am also over corporate tears. For decades they have been crying about a lack of skilled workers in IT and pushing for more and more people to join, so that they can depress wages, and as soon as the economy is bad, they instantly u-turn and dump employees.

If corporations want to be short-sighted and make people suffer for it, they won't get compassion from me when it fails.

Edit: Remember, we are not the ones pulling the ladder up.

[–] Feyd@programming.dev 3 points 1 week ago (3 children)

I would say that "replacing with AI assistance" is probably not what is actually happening. It's economic factors reducing hiring. This isn't the first time it has happened, and it won't be the last. The AI boosters are just claiming responsibility for marketing purposes.

[–] bassomitron@lemmy.world 3 points 1 week ago (1 children)

Couldn't have said it better myself. The amount of pure hatred for AI that's already spreading is pretty unnerving when we consider future/continued research. Rather than direct the anger toward the companies misusing and/or irresponsibly hyping the tech, people direct it at the tech itself. And the C-suites will of course never accept the blame for their poor judgment, so they, too, will blame the tech.

Ultimately, I think there are still lots of folks with money that understand the reality and hope to continue investing in further research. I just hope that workers across all spectrums use this as a wake up call to advocate for protections. If we have another leap like this in another 10 years, then lots of jobs really will be in trouble without proper social safety nets in place.

[–] Feyd@programming.dev 14 points 1 week ago

People specifically hate having tools they find more frustrating than useful shoved down their throats, having the internet filled with generative AI slop, and melting glaciers in the context of climate change.

This is all specifically directed at LLMs in their current state and will have absolutely zero effect on any research funding. Additionally, OpenAI etc. would be losing less money if they weren't selling (at a massive loss) the hot garbage they're selling now and focused on research instead.

As far as worker protections, what we need actually has nothing to do with AI in the first place and has everything to do with workers/society at large being entitled to the benefits of increased productivity that has been vacuumed up by greedy capitalists for decades.

[–] stsquad@lemmy.ml 3 points 1 week ago

They can be helpful when using a new library or development environment which you are not familiar with. I've noticed a tendency for them to make up functions that arguably should exist but often don't.

[–] xep@fedia.io 23 points 1 week ago (4 children)

Code reviews take up a lot of time, and if I know a lot of code in a review is AI generated I feel like I'm obliged to go through it with greater rigour, making it take up more time. LLM code is unaware of fundamental things such as quirks due to tech debt and existing conventions. It's not great.

[–] FancyPantsFIRE@lemmy.world 16 points 1 week ago (1 children)

I’ve used cursor quite a bit recently in large part because it’s an organization wide push at my employer, so I’ve taken the opportunity to experiment.

My best analogy is that it’s like micro managing a hyper productive junior developer that somehow already “knows” how to do stuff in most languages and frameworks, but also completely lacks common sense, a concept of good practices, or a big picture view of what’s being accomplished. Which means a ton of course correction. I even had it spit out code attempting to hardcode credentials.

I can accomplish some things “faster” with it, but mostly in comparison to my professional reality: I rarely have the contiguous chunks of time I’d need to dedicate to properly ingest and do something entirely new to me. I save a significant amount of the onboarding, but lose a bunch of time navigating to a reasonable solution. Critically that navigation is more “interrupt” tolerant, and I get a lot of interrupts.

That said, this year’s crop of interns at work seem to be thin wrappers on top of LLMs and I worry about the future of critical thinking for society at large.

[–] Feyd@programming.dev 9 points 1 week ago* (last edited 1 week ago)

That said, this year’s crop of interns at work seem to be thin wrappers on top of LLMs and I worry about the future of critical thinking for society at large.

This is the most frustrating problem I have. With a few exceptions, LLM use seems to be inversely proportional to skill level, and having someone tell me "chatgpt said ___" when asking me for help, because ChatGPT is clearly not solving their problem, makes me want to just hang up.

[–] BrianTheeBiscuiteer@lemmy.world 12 points 1 week ago (5 children)

Just the other day I wasted 3 min trying to get AI to sort 8 lines alphabetically.

[–] bassomitron@lemmy.world 4 points 1 week ago (2 children)

By having it write a quick function to do so or to sort them alphabetically within the chat? Because I've used GPT to write boilerplate and/or basic functions for random tasks like this numerous times without issue. But expecting it to sort a block of text for you is not what LLMs are really built for.

That being said, I agree that expecting AI to write complex and/or long-form code is a fool's hope. It's good for basic tasks to save time and that's about it.
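For reference, the "quick function" route is a couple of lines in most languages; a sketch in Python, with the function name invented for illustration:

```python
def sort_lines(text: str) -> str:
    # Sort the lines of a text block alphabetically (case-insensitive),
    # and join them back into a single string.
    return "\n".join(sorted(text.splitlines(), key=str.lower))
```

Which is arguably less typing than arguing with a chatbot about it.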

[–] doxxx@lemmy.ca 3 points 1 week ago* (last edited 1 week ago)

I’ve actually had a fair bit of success getting GitHub Copilot do things like this. Heck I even got it to do some matrix transformations of vectors in a JSON file.
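A transformation like that is only a few lines once the JSON's shape is known. A sketch assuming a list of 2-D vectors and a 90° rotation, with all names and data invented for illustration:

```python
import json

# Hypothetical input: a JSON document holding a list of 2-D vectors.
doc = '{"vectors": [[1, 0], [0, 1]]}'

# 90-degree counter-clockwise rotation matrix.
matrix = [[0, -1],
          [1,  0]]


def transform(vec, m):
    # Multiply matrix m by column vector vec (plain lists, no numpy).
    return [sum(m[i][j] * vec[j] for j in range(len(vec)))
            for i in range(len(m))]


vectors = [transform(v, matrix) for v in json.loads(doc)["vectors"]]
```

The LLM's value here is mostly in remembering the JSON plumbing, not the math.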

[–] worldistracist@lemmy.cafe 6 points 1 week ago

Great! Less productivity = more jobs, more work security.
