this post was submitted on 03 Jun 2025
389 points (98.0% liked)
Technology
you are viewing a single comment's thread
There are no trustworthy LLMs. They don't know or understand what they're saying - they're literally just predicting words that sound like they match what they were trained on. An LLM is only barely smarter than a parrot, and it has no idea how to research anything or tell facts from made-up bullshit. You're wasting your time by trying to force it to do something it's literally incapable of doing.
You're better off doing the research the hard way: check primary sources, then check the credibility of those sources.
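To make the "just predicting words" point concrete, here's a deliberately crude toy sketch (a bigram word counter, nowhere near how real LLMs are built): it picks the next word purely from counts in its training text, with no notion of meaning or truth.

```python
from collections import Counter, defaultdict

# Toy "next word predictor": count which word follows which in the
# training text, then always emit the most frequent follower.
# It has no idea what any of these words mean.
training_text = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for current_word, next_word in zip(training_text, training_text[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    # Return the most common follower seen in training, or None.
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" - seen twice after "the" in training
```

Real LLMs do this over far richer contexts with learned statistics instead of raw counts, but the objective is the same: output the likeliest continuation, not the true one.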
Considering that parrots can have actual thoughts, I'd say LLMs are even less smart than that.