In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X have asked Grok to "unblur" or remove the black boxes that were placed over the faces of children and women in the images to protect their privacy.

[–] calcopiritus@lemmy.world 1 points 7 hours ago (1 children)

Did it have any full glasses of water? According to my theory, it has to have data for both "full" and "wine".

[–] vala@lemmy.dbzer0.com 1 points 6 hours ago (1 children)

Your theory is more or less incorrect. It can't interpolate as broadly as you think it can.

[–] calcopiritus@lemmy.world 1 points 20 minutes ago

The wine thing could prove me wrong, if someone would answer my question.

But I don't think my theory is that wild. LLMs can interpolate; that is a fact. You can ask one to make a bear with duck hands and it will do it. I've seen images like that on the internet, generated by LLMs.
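To make the interpolation point concrete, here is a minimal sketch of one way concept blending works in open text-to-image models, assuming the Hugging Face diffusers library and Stable Diffusion v1.5. The model ID, prompts, and 50/50 blend weight are illustrative assumptions, and this is a generic, publicly documented technique (prompt-embedding interpolation), not whatever Grok actually does internally.

```python
# A minimal sketch of concept interpolation in a text-to-image model.
# Assumes the Hugging Face diffusers library; the model ID, prompts,
# and blend weight below are illustrative, not Grok's actual pipeline.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def embed(prompt: str) -> torch.Tensor:
    # Encode a prompt into the CLIP text-embedding space the model conditions on.
    tokens = pipe.tokenizer(
        prompt,
        padding="max_length",
        max_length=pipe.tokenizer.model_max_length,
        truncation=True,
        return_tensors="pt",
    ).input_ids.to("cuda")
    with torch.no_grad():
        return pipe.text_encoder(tokens)[0]

# Two concepts the model has seen separately in its training data.
emb_bear = embed("a photo of a bear")
emb_duck = embed("a photo of duck feet")

# Linear blend of the two text embeddings: the "interpolation" under discussion.
blended = 0.5 * emb_bear + 0.5 * emb_duck

image = pipe(prompt_embeds=blended).images[0]
image.save("bear_duck_blend.png")
```

In practice you would usually just prompt for "a bear with duck hands" directly; the embedding blend only illustrates that the model's conditioning space is continuous, which is what makes this kind of mixing possible at all.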

Who is to say interpolating nude children from regular children + nude adults is too wild?

Furthermore, you don't need CSAM for photos of nude children.

Children are nude at beaches all the time; there are probably many photos on the internet with nude children in the background of beach shots. That would probably help the LLM.