[–] VoteNixon2016@lemmy.blahaj.zone 3 points 4 days ago (1 children)

We could feed the pedo AI more training data and make it even better!

No.

[–] Allero@lemmy.today 1 points 3 days ago* (last edited 3 days ago) (1 children)

Why though? If it does reduce consumption of real CSAM and/or real-life child abuse (which is an "if", as the stigma around the topic greatly hinders research), it's a net win.

Or is it simply a matter of spite?

Pedophiles don't choose to be attracted to children, and many struggle to keep those urges at bay. Traditionally, those looking for the least harmful release turned to real CSAM, which is obviously extremely harmful in its own right - just a step below going out and abusing a child directly. Now that AI-generated material exists, it may offer the safest of the graphic outlets we know of, with the least harm done to children. Without it, many pedophiles will revert to traditional CSAM, increasing the number of victims needed to meet the demand.

As with many other things, the best we can hope for here is harm reduction. Hardline policies do not seem to be effective enough: people keep finding ways to distribute CSAM, and pedophiles keep finding ways to access it without leaving a trace. So we need to think of ways to give them something that will make them choose AI over real material. This means making AI better, more realistic, and at the same time more diverse. Not for their enjoyment, but to make them switch to something better and safer than what they currently use.

I know it's a very uncomfortable discussion, but we don't have a magic pill to eliminate it all, so we must act reasonably and prevent what we can.

[–] VoteNixon2016@lemmy.blahaj.zone 1 points 3 days ago (1 children)

Because with harm reduction as the goal, the solution is never "give them more of the harmful thing."

I'll compare it to the problems of drug abuse. You don't help someone with an addiction by giving them more drugs; you don't help them by throwing them in jail just for having an addiction; you help them by making it safe and easy to get treatment for the addiction.

Look at what Portugal did in the early 2000s to help mitigate the problems associated with drug use, treating it as a health crisis rather than a criminal one.

You don't arrest someone for being addicted to meth, you arrest them for stabbing someone and stealing their wallet to buy more meth; you don't arrest someone just for being a pedophile, you arrest them for abusing children.

"This means making AI better, more realistic, and at the same time more diverse."

No, it most certainly does not. AI is already being used to generate explicit images of actual children. Making it better at that task is the opposite of harm reduction; it makes creating new victims easier than ever.

Acting reasonably to prevent what we can prevent means shutting down the CSAM-generating bot, not optimizing and improving it.

[–] Allero@lemmy.today 1 points 3 days ago

To me, it's more like the Netherlands handing out free syringes and needles so that drug users at least don't contract anything from used ones.

To be clear: granting any and all pedophiles access to therapy would be a tremendous help. I think it must be done. But two issues remain:

  1. Barely any government will scrape together enough money to fund such programs now that therapy is astronomically expensive.
  2. Even then, plenty of pedophiles will keep consuming CSAM, legally or not. There must be some incentive for them to choose the AI-generated option, which is at least less harmful than the alternative.