this post was submitted on 07 Mar 2026
388 points (99.2% liked)
Technology
I could see the argument for things that aren't particularly important, but to continue with the legal example, it seems akin to the difference between asking a practicing lawyer a question and asking someone who watched Boston Legal when it aired and can quote James Spader.
Unfortunately, with the potential for a hallucinatory response, anything beyond quite simplistic queries shouldn't be relied on with more weight than a crutch of toothpicks.
I don't think you are wrong, but again, that's not the case.
You're making an argument about speech here.
Let's say you make a fan website based entirely on a fine-tuned LLM which acts and responds as James Spader from Boston Legal. Are you liable if a user of that website construes that speech as legal advice?
If you are willing to give up access to speech so easily, I have almost no hope for Americans in the near future.
What laws like this do is create an incredibly high-pass filter that favors those in positions of established power. It's literally suicidal in regards to freedom of speech on the internet.
The right answer is that if you are dumb enough to have gotten your legal advice from an AI hallucination of James Spader, you get to absorb those consequences. The wrong answer is to tell people they aren't allowed to build fan websites of James Spader giving questionable legal advice.
In your example, say you go to a lawyer and ask legal questions. If the lawyer is not providing legal advice (i.e. taking on the role of being your lawyer and representing you in that matter), they are required by law to express that at the beginning so that they will not be held liable, because they are a legal professional.
Wikipedia, Google, chatgpt etc are not legal authorities or legal professionals.
There is also no human entity to hold legally responsible if the LLM hallucinates or cites a source that is not factual (satire, for instance).
We also know that the vast majority of people who use chatbots do not check the sources the answers come from.
So: when Wikipedia presents information, it is not giving legal advice. That is borne out in case law.
The reason it's dangerous to get legal or health information from a chatbot is the same reason you wouldn't want to randomly trust reddit.
No lawyers are going to reddit to get help writing legal briefs. We have seen lawyers using LLMs for that, though.
Presumably such a site would be visually obvious as parody. Having it give jokey answers as a caricature would be one thing. If you dressed it up as a professional legal advice service offering opinions on criminal law from Alan Shore, that could be problematic.
At a certain point of information sharing, we should want a high bar for the ones providing the answers. When asking nuanced questions, we should want the answer to come from knowledge, not memory. I made an example in this other comment.
I'm not sure I agree with your 'right answer' bit. Personally, I'd prefer dumb people to be protected in a similar way that I want the elderly protected from losing their savings from an email scam.
I promise you, the result of this will be unlimited free speech for corporations and their LLMs, with limited and regulated free speech for you. Save or favorite this comment.
It's the same "protect the children" anti free speech advocacy in a different wrapper, but more appealing to this audience because "llm bad".
They're using your emotional response to not liking LLMs as a tool to trick you into giving away your rights.