this post was submitted on 23 Dec 2025
821 points (97.6% liked)
AI code is great for getting over a hump, something you're stuck on. I used ChatGPT (not the best for coding, I know) to help with a PowerShell script. There were exactly two references on the internet for what I wanted to do (Google Calendar/Sheets integration). I'd spent hours on the problem.
ChatGPT gave me two things: one solution I didn't know was a thing, and the other was a twist I hadn't thought of. For giggles, I plugged the whole script in. Guess what? Failed instantly. Because of course it did.
No. LLMs don't write working code. Yes. They can help you, assuming you know what you're doing in the first place. But here's the crux of using AI:
It does not, and cannot, give a shit about edge cases, user error, and security.
I wrote a simple PS script to swap my TV screens around for work, play, and movies. Rolled it out in 30 minutes. Took me 2 more hours to stupid-proof it, test it, wrap it in an exe, make an icon, deploy it, all that. AI can't do any of that.
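To make the "stupid-proofing" point concrete, here's a minimal sketch of what a script like that might look like. This is not the original script; the mode names and the mapping to display layouts are assumptions. It leans on `DisplaySwitch.exe`, a utility that ships with Windows, and shows the kind of input validation and failure handling an LLM won't volunteer on its own:

```powershell
# Hypothetical sketch of a display-swapping script with basic stupid-proofing.
# The mode names ('work', 'play', 'movies') and their mappings are assumptions.
param(
    [Parameter(Mandatory = $true)]
    [ValidateSet('work', 'play', 'movies')]  # reject unexpected input up front
    [string]$Mode
)

# Guard against the tool being missing instead of failing cryptically later.
$switch = Join-Path $env:SystemRoot 'System32\DisplaySwitch.exe'
if (-not (Test-Path $switch)) {
    Write-Error "DisplaySwitch.exe not found; cannot change display layout."
    exit 1
}

switch ($Mode) {
    'work'   { & $switch /extend }    # desktop spread across both screens
    'play'   { & $switch /internal }  # main monitor only
    'movies' { & $switch /external }  # TV only
}
```

The two guard clauses are the cheap part; the hours go into testing each mode on real hardware, handling the monitor that's unplugged that day, and packaging it so someone else can double-click it.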