this post was submitted on 11 Feb 2026
285 points (98.3% liked)


In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X have asked Grok to “unblur” or remove the black boxes covering the faces of children and women in images that were meant to protect their privacy.

[–] AnarchistArtificer@slrpnk.net 35 points 12 hours ago

The datasets they are trained on do in fact include CSAM. These datasets are so huge that it easily slips through the cracks. It's usually removed whenever it's found, but I don't know how much that actually helps with models that have already been trained on the data; to my knowledge, it's not possible to selectively "untrain" a model, so they would need to be retrained from scratch. Plus it still crops up in the news every so often when new CSAM is found in the training data.

It's one of the many, many problems with generative AI.
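
For what it's worth, the "removed whenever it's found" part is typically done by matching images against hash lists of known material maintained by child-safety organisations, then dropping the matches from the dataset. Here's a minimal sketch of that kind of filter in Python; the blocklist file and the record format are made up for illustration, not any real pipeline's code:

```python
# Rough sketch of hash-based dataset filtering: drop any image whose digest
# appears on a blocklist of known material. The blocklist path and the
# record format ("image_bytes") are hypothetical, just to show the idea.
import hashlib


def load_blocklist(path: str) -> set[str]:
    """Load known-bad hashes, one hex SHA-256 digest per line."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


def filter_records(records, blocklist: set[str]):
    """Yield only records whose image bytes don't hash to a blocked digest."""
    for record in records:
        digest = hashlib.sha256(record["image_bytes"]).hexdigest()
        if digest not in blocklist:
            yield record
```

The catch is exactly what the comment above says: this only cleans the dataset going forward, so a model already trained on the unfiltered data keeps whatever it learned. And exact-hash matching misses re-encoded or cropped copies, which is why perceptual hashes like PhotoDNA tend to be used in practice.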