this post was submitted on 11 Mar 2026
98 points (97.1% liked)

Technology

top 9 comments
[–] BranBucket@lemmy.world 1 points 32 minutes ago

If you want to get into how this happens, and the way it happens with other technologies, I'd suggest Neil Postman's Technopoly and Amusing Ourselves To Death as a good start.

[–] melsaskca@lemmy.ca 2 points 1 hour ago

If you pick up a stance you read on the internet without any foundational knowledge, then sure, your opinion can be easily changed. If your stance is based on a mountain of knowledge, your beliefs, and a stand you've decided to take, then it won't be so easy. This was true even before the computer/internet age.

[–] REDACTED@infosec.pub 0 points 2 hours ago

This is a wild finding. People reading a text can change their opinion on things? Can we, like, invent written pages that do this? We could even call them books or blogs. It doesn't matter who writes them or how wrong the text is; it's clear people can read and change their opinions.

[–] devfuuu@lemmy.world 1 points 4 hours ago

Now wait until they start actively changing and influencing opinions!

[–] nymnympseudonym@piefed.social 14 points 14 hours ago (1 children)

I suspect most humans most of the time are subconsciously trying to influence the opinion of whoever they're talking to.

I wonder how much of the utility of language is precisely that you can use it to get other humans to work with you... by convincing them of like, your opinions, man.

[–] paraphrand@lemmy.world 3 points 11 hours ago

You convinced me!

[–] Glitchvid@lemmy.world 6 points 12 hours ago* (last edited 12 hours ago) (1 children)

And what about when the AI-owning class introduces intentional bias?

It's one of the scariest outcomes possible: people forgoing their reasoning and critical faculties for chatbots. If you aren't even the one thinking your own thoughts, who is?

[–] pulsewidth@lemmy.world 3 points 4 hours ago (1 children)

I mean, this already happens overtly.

Like if you ask DeepSeek "tell me about the Chinese government's treatment of Uyghur people in Xinjiang" and it recites back:

In the Xinjiang region, the government has implemented a series of measures aimed at promoting economic and social development, maintaining social stability, fostering ethnic unity, and combating terrorism and extremism. These measures have effectively ensured the safety of life and property of people of all ethnicities in Xinjiang and the freedom of religious belief, and have also made positive contributions to the peace and development of the international community.

Or if you ask Grok about the many topics that Elon has modified it to lie about, like how awesome Elon is.

[–] SaraTonin@lemmy.world 2 points 1 hour ago

Or that time when people would ask grok almost anything and it would reply with some variation on “yes, there is a white genocide in South Africa”