How much of that is the chat bot itself versus humans just being horrible at self-reporting symptoms?
That is why "bedside manner" is so important. Connect the dots and ask follow up questions for clarifications or just look at a person and assume they are wrong. Obviously there are some BIG problems with that (ask any black woman, for example) but... humans are horrible at reporting symptoms.
Which gets back to how "AI" is actually an incredible tool (especially in this case when it is mostly a human language interface to a search engine) but you still need domain experts in the loop to understand what questions to ask and whether the resulting answer makes any sense at all.
Yet, instead, people do the equivalent of just raw dogging whatever the first response on stack overflow is.
Raw dogging the first response from stack overflow to try to fix a coding issue isn't going to kill someone.
It is if your software goes anywhere near infrastructure or safety.
Which is literally what Musk and the oligarchs were arguing as a way to "fix" Air Traffic Control. And that is far from the first time tech charlatans have wanted to "disrupt" an industry.
Someone who uses stack overflow to solve a problem will be doing testing to confirm it worked as part of an overall development workflow.
Using an LLM as a doctor is like vibe coding, where there is no testing or quality control.
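A minimal sketch of the difference (all names here are hypothetical, not from any specific SO answer): the snippet you paste in is only half the workflow; the quick tests you write around it are the part that confirms it actually worked.

```python
# Hypothetical snippet adapted from a Stack Overflow answer.
def chunk_list(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# The "testing to confirm it worked" part: quick checks before the
# snippet goes anywhere near production code.
import unittest

class ChunkListTest(unittest.TestCase):
    def test_even_split(self):
        self.assertEqual(chunk_list([1, 2, 3, 4], 2), [[1, 2], [3, 4]])

    def test_remainder(self):
        self.assertEqual(chunk_list([1, 2, 3], 2), [[1, 2], [3]])

    def test_empty_input(self):
        self.assertEqual(chunk_list([], 3), [])

if __name__ == "__main__":
    unittest.main()
```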
So... they wouldn't be raw dogging stack overflow? Because raw dogging the code you get from a rando off stack overflow is a bad idea?
Because you can just as easily use generative AI as a component in test-driven development. But the people pushing to "make coders more efficient" are looking at firing people. And they continue to not want to add the guard rails that would mean they fire 1 engineer instead of 5.
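For what that looks like, here is a minimal sketch (the function and its body are made up for illustration): the human writes the tests first, the implementation can come from a model or anywhere else, and it only gets accepted if the tests pass. The tests are the guard rail.

```python
import unittest

def parse_duration(text: str) -> int:
    """Convert a string like '1h30m' into seconds.

    Pretend this body was produced by an LLM; the tests below are the
    human-written guard rails that decide whether it gets merged.
    """
    seconds = 0
    number = ""
    for ch in text:
        if ch.isdigit():
            number += ch
        elif ch == "h":
            seconds += int(number) * 3600
            number = ""
        elif ch == "m":
            seconds += int(number) * 60
            number = ""
        else:
            raise ValueError(f"unexpected character: {ch!r}")
    return seconds

# Tests written before (or independently of) the generated code.
class ParseDurationTest(unittest.TestCase):
    def test_hours_and_minutes(self):
        self.assertEqual(parse_duration("1h30m"), 5400)

    def test_minutes_only(self):
        self.assertEqual(parse_duration("45m"), 2700)

    def test_rejects_garbage(self):
        with self.assertRaises(ValueError):
            parse_duration("soon")

if __name__ == "__main__":
    unittest.main()
```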