this post was submitted on 06 Jan 2026
595 points (98.1% liked)
[–] VoteNixon2016@lemmy.blahaj.zone 1 points 3 days ago (1 children)

Because with harm reduction as the goal, the solution is never "give them more of the harmful thing."

I'll compare it to the problem of drug abuse. You don't help someone with an addiction by giving them more drugs, and you don't help them by throwing them in jail just for having an addiction; you help them by making treatment for the addiction safe and easy to get.

Look at what Portugal did in the early 2000s to help mitigate the problems associated with drug use, treating it as a health crisis rather than a criminal one.

You don't arrest someone for being addicted to meth, you arrest them for stabbing someone and stealing their wallet to buy more meth; you don't arrest someone just for being a pedophile, you arrest them for abusing children.

> This means making AI better, more realistic, and at the same time more diverse.

No, it most certainly does not. AI is already being used to generate explicit images of actual children. Making it better at that task is the opposite of harm reduction, it makes creating new victims easier than ever.

Acting reasonably to prevent what we can prevent means shutting down the CSAM-generating bot, not optimizing and improving it.

[–] Allero@lemmy.today 1 points 3 days ago

To me, it's more like the Netherlands handing out free syringes and needles so that drug users at least wouldn't contract something from shared ones.

To be clear: granting any and all pedophiles access to therapy would be of tremendous help. I think it must be done. But there are two issues remaining:

  1. Barely any government will scrape together enough money to fund such programs now that therapy is astronomically expensive.
  2. Even then, plenty of pedophiles will keep consuming CSAM, legally or not. There must be some incentives for them to choose the AI-generated option that is at least less harmful than the alternative.