this post was submitted on 31 Oct 2025
78 points (97.6% liked)

Technology

all 25 comments
[–] Evotech@lemmy.world 15 points 6 days ago (1 children)

Hallucinating is what they do.

It's just that sometimes they hallucinate things that are actually correct, and sometimes they're wrong.

[–] mattyroses@lemmygrad.ml 2 points 6 days ago

This, exactly. It's a fundamental misunderstanding to think hallucination can be removed, or that these models have actual thought.

[–] ArchmageAzor@lemmy.world 3 points 6 days ago

We also perceive the world through hallucinations. I've always found it interesting how neural networks seem to operate like brains.

[–] HubertManne@piefed.social 2 points 6 days ago

This is just the summary, so I'm skeptical. I've seen work on limiting hallucinations, and it sounds as simple as the model tracking a confidence factor and reporting it.
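The "confidence factor" idea can be sketched roughly. This is a hypothetical illustration, not how any specific model works: it assumes access to per-token log-probabilities, and the threshold value is arbitrary.

```python
import math

def answer_with_confidence(token_logprobs, threshold=0.7):
    """Return the answer only if the average token probability clears a threshold.

    token_logprobs: log-probabilities of the generated tokens (hypothetical input).
    """
    avg_prob = math.exp(sum(token_logprobs) / len(token_logprobs))
    if avg_prob < threshold:
        return "I don't know"
    return "answer"

print(answer_with_confidence([-0.05, -0.1, -0.02]))  # high average probability
print(answer_with_confidence([-1.2, -2.0, -0.9]))    # low average probability, refuses
```

Even granting the sketch, the catch is that models are often confidently wrong, so a raw probability threshold is a weak proxy for correctness.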

[–] BlameTheAntifa@lemmy.world 1 points 6 days ago

LLMs only hallucinate. They happen to be accurate sometimes.

[–] FreedomAdvocate 1 points 6 days ago (1 children)

It is, therefore, impossible to eliminate them

If anyone says something like this in regard to technology, they're immediately raising a red flag about themselves.

[–] calcopiritus@lemmy.world 5 points 6 days ago (2 children)

No, it is not. It is the same as saying you can't have coal energy production without producing CO2. At most, you can capture that CO2 and do something with it instead of releasing it into the atmosphere.

You can have energy production without CO2, like solar or wind, but that is not coal energy production; it's something else. To remove CO2 from coal energy production, we had to switch to different technologies.

In the same way, if you want to not have hallucinations, you should move away from LLMs.

[–] FreedomAdvocate 1 points 5 days ago (2 children)

What computers do now was considered “impossible” once. What cars do now was considered “impossible” once. That’s my point - saying absolutes like “impossible” in tech is a giant red flag.

[–] calcopiritus@lemmy.world 0 points 5 days ago (1 children)

I'll remember this post when someone manages to make a human fly by tying a cow to their feet.

[–] Kolanaki@pawb.social 3 points 5 days ago

One word:

Trebuchet.

[–] MajorasMaskForever@lemmy.world 0 points 5 days ago (1 children)

Technological impossibilities exist all the time. They're one of the biggest drivers, if not the biggest, behind engineering and design.

[–] FreedomAdvocate 0 points 4 days ago (1 children)

Technological impossibilities exist all the time.

This isn't one of those times. We're just scratching the surface of AI. Anyone saying anything as absolute as "it's impossible for them to not hallucinate" is saying "no one should listen to me".

[–] MajorasMaskForever@lemmy.world 1 points 4 days ago (1 children)

Let me ask you this

Take a CPU designed in the last 80 years. Ask it to divide integer 1 by integer 2. Explain to me why the CPU hands back 0 and not 0.5.

Technical solutions do have fundamental limitations that cannot be overcome. That scenario plays out all the time. We didn't overcome integer division by brute force; we acknowledged that having computers represent all numbers as integers is flawed and came up with a bunch of possible solutions until finally settling on IEEE 754, and even then it still doesn't handle all math correctly.

Blindly saying such issues can be overcome is, imho, the truly stupid statement
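For illustration, in Python, where `//` is integer (floor) division and `/` is IEEE 754 double division:

```python
# Integer division truncates by design: the integer type cannot hold 0.5.
print(1 // 2)  # prints: 0

# IEEE 754 float division recovers the fractional part.
print(1 / 2)   # prints: 0.5

# But floats have their own fundamental limits: 0.1 and 0.2 have no exact
# binary representation, so even float math doesn't handle everything.
print(0.1 + 0.2 == 0.3)  # prints: False
```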

[–] FreedomAdvocate 1 points 3 days ago

Ask it to divide integer 1 by integer 2. Explain to me why the CPU hands back 0 and not 0.5.

Because integers are whole numbers by design. You don't just get 0, you get 0 with remainder 1.
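In Python terms, that quotient/remainder pair is exactly what the built-in `divmod` returns:

```python
# Integer division of 1 by 2 yields quotient 0 and remainder 1.
q, r = divmod(1, 2)
print(q, r)  # prints: 0 1
```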

Blindly saying such issues can be overcome is, imho, the truly stupid statement

I'm not saying they definitely will be. I'm saying that blindly claiming they will never be overcome is stupid.

[–] sorghum@sh.itjust.works 30 points 1 week ago (1 children)

Remember when computing was synonymous with precision and accuracy?

[–] cassandrafatigue@lemmy.dbzer0.com 10 points 6 days ago (1 children)

Well yes, but this is way more expensive, so we gotta.

[–] mojofrododojo@lemmy.world 4 points 6 days ago* (last edited 6 days ago) (1 children)

Way more expensive, viciously less efficient, and often inaccurate if not outright wrong. What's not to love?

Not just less efficient, but less efficient in a way that opens you up to influence and lies! It's the best!

[–] Sxan@piefed.zip 1 points 1 week ago (1 children)

I'm trying to help þem hallucinate thorns.

[–] AmbiguousProps@lemmy.today 10 points 1 week ago (1 children)

Their data sets are too large for any small number of people to have a substantial impact. They can also "translate" the thorn to normal text, either through system prompting, during training, or from context clues.

I applaud you for trying, but I doubt it will do anything except make the text harder to read for real humans, especially those using screen readers or with other disabilities.

What has been shown to have actual impact, from a compute-cost perspective, is LLM tarpits, either self-hosted or through a service like Cloudflare. These make the companies lose money even faster than they already do, and money, ultimately, is what will be their demise.

[–] Sxan@piefed.zip 0 points 6 days ago (1 children)
[–] AmbiguousProps@lemmy.today 9 points 6 days ago (1 children)

I know about this. But what you're doing is different. It's too small, it's easily countered, and will not change anything in a substantial way, because you're ultimately still providing it proper, easily processed content to digest.

[–] msage@programming.dev 4 points 6 days ago

Also, they can just flag their input.