this post was submitted on 26 Jul 2025
345 points (98.6% liked)

[–] spankmonkey@lemmy.world 69 points 21 hours ago (5 children)

Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.

If it is necessary to fact check something every single time you use it, what benefit does it give?

[–] Feyd@programming.dev 7 points 12 hours ago (1 children)

That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I confirm it's lying in 2 minutes.

[–] spankmonkey@lemmy.world 4 points 12 hours ago

"Thank you for wasting my time."

[–] A_norny_mousse@feddit.org 30 points 20 hours ago

None. None at all.

[–] brsrklf@jlai.lu 18 points 20 hours ago* (last edited 20 hours ago)

None. It's made with the clear intention of substituting itself for actual search results.

If you don't fact-check it, it's dangerous and/or a thinly disguised ad. If you do fact-check it, it brings absolutely nothing that you couldn't find on your own.

Well, except hallucinations, of course.

[–] artyom@piefed.social -1 points 17 hours ago (1 children)

It hasn't stopped anyone from using ChatGPT, which has become their biggest competitor since the inception of web search.

So yes, it's dumb, but they kind of have to do it at this point. And they need everyone to know it's available from the site they're already using, so they push it on everyone.

[–] spankmonkey@lemmy.world 6 points 15 hours ago (1 children)

No, they don't have to use defective technology just because everyone else is.

[–] artyom@piefed.social -5 points 15 hours ago
[–] XTL@sopuli.xyz -4 points 18 hours ago (3 children)

It might be able to give you tables or otherwise collated sets of information about multiple products etc.

I don't know if Google does, but LLMs can. Also do unit conversions. You probably still want to check the critical ones. It's a bit like using an encyclopedia or a catalog except more convenient and even less reliable.

[–] alsimoneau@lemmy.ca 1 points 22 minutes ago

Or go to Wolfram Alpha and get actual computations done instead of ramblings?

[–] spankmonkey@lemmy.world 9 points 18 hours ago* (last edited 18 hours ago) (1 children)

Google had a feature for converting units way before the AI boom and there are multiple websites that do conversions and calculations with real logic instead of LLM approximation.

It is more like asking a random person who will answer whether or not they know the right answer. An encyclopedia or catalog at least has a publication date for context.

Putting the data into tables and other formats isn't helpful if the data is wrong!
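For what it's worth, the "real logic" those converter sites use is just exact lookup tables, no statistical guessing involved. A minimal sketch (the table and function names here are made up for illustration, and the table is nowhere near complete):

```python
# Deterministic unit conversion via a table of exact factors.
# Every length unit is expressed in meters, so any-to-any conversion
# is one multiply and one divide -- no LLM approximation anywhere.
TO_METERS = {
    "m": 1.0,
    "km": 1000.0,
    "mi": 1609.344,  # international mile, exact by definition
    "ft": 0.3048,    # international foot, exact by definition
}

def convert_length(value, src, dst):
    """Convert `value` from unit `src` to unit `dst`, going through meters."""
    return value * TO_METERS[src] / TO_METERS[dst]

print(convert_length(5, "km", "mi"))  # roughly 3.107 miles
```

Tools like GNU `units` are essentially this with a database of thousands of units behind it.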

[–] A_norny_mousse@feddit.org 3 points 14 hours ago

a feature for converting units

So does DDG

[–] Feyd@programming.dev 1 points 12 hours ago* (last edited 12 hours ago) (2 children)

You can do unit conversions with PowerToys on Windows, Spotlight on Mac, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, with exactly the same convenience as an LLM. Doing discrete things like that with LLM inference is the most inefficient and stupid way to do them.

[–] alsimoneau@lemmy.ca 1 points 22 minutes ago

On Linux there's also 'units' which is amazing for this.

[–] XTL@sopuli.xyz 1 points 2 hours ago* (last edited 2 hours ago) (1 children)

All things were doable before. The point is that they were manual extra steps.

[–] Feyd@programming.dev 1 points 54 minutes ago

They weren't though. You put stuff in the search bar and it detected you were asking about unit conversion and gave you an answer, without ever involving an LLM. Are you being dense on purpose?