this post was submitted on 23 Mar 2026
157 points (98.2% liked)

Technology

[–] Zamboni_Driver@lemmy.ca 1 points 23 hours ago (1 children)

Lol

"Do your own research"

Ok Karen, sure. So it's up to me to prove other people's random claims that they make on social media? Um, no.

[–] Senal@programming.dev 1 points 22 hours ago* (last edited 22 hours ago) (1 children)
[–] Zamboni_Driver@lemmy.ca 1 points 22 hours ago (1 children)

Yeah, I have zero issue with the fact that accounts with pictures of children's genitals on them should be referred to the authorities.

If people want privacy, host the pictures locally.

When you're storing images with a cloud provider, they become responsible for the images they store. If it's a photo of a child's genitals, it's illegal for them to have those images on their servers, and they need to protect themselves.

[–] Senal@programming.dev 1 points 21 hours ago

Ah, this is probably my fault.

I'm not the person you were replying to, so I wasn't really arguing any of these points; I just saw the request and knew of an example, so I provided it.

Just in case this was for me specifically I’ll answer:

Yeah, I have zero issue with the fact that accounts with pictures of children's genitals on them should be referred to the authorities.

Pictures of children’s genitals aren’t inherently CSAM, there are plenty of parents and family members with entirely innocent pictures of their kids on their phones.

There are reported cases of false positives leading to bad outcomes; this is easily searchable.

I'm not saying to do nothing; I'm saying blanket reporting is an ineffective brute-force approach.
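To illustrate why false positives are baked into this kind of scanning: providers typically match against perceptual hashes rather than exact file hashes. A minimal sketch (an "average hash" toy, not any provider's actual algorithm; the tiny 4x4 "images" here are made up for illustration) shows how two different images can produce identical coarse hashes:

```python
# Toy average-hash sketch: reduce an image to a coarse light/dark bit
# pattern, then compare patterns by Hamming distance. Visually distinct
# images with a similar coarse layout can collide.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit list."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p >= mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two different 4x4 "images" whose coarse light/dark layout matches.
img_a = [[200, 200, 10, 10],
         [200, 200, 10, 10],
         [10, 10, 200, 200],
         [10, 10, 200, 200]]
img_b = [[180, 220, 30, 5],
         [190, 210, 20, 25],
         [5, 30, 210, 190],
         [25, 20, 180, 220]]

print(hamming(average_hash(img_a), average_hash(img_b)))  # prints 0
```

The pixels differ everywhere, yet the hashes are identical, so a distance-threshold match fires. Real systems (PhotoDNA, NeuralHash, etc.) are far more sophisticated, but the same trade-off applies: tolerance to re-encoding and cropping is exactly what lets unrelated images collide.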

If people want privacy, host the pictures locally.

In theory yes, in practice, not so much.

On-device scanning exists and is (or has been) in use on phones; examples of this are also easily searchable.

When you're storing images with a cloud provider, they become responsible for the images they store. If it's a photo of a child's genitals, it's illegal for them to have those images on their servers, and they need to protect themselves.

The need for legal protection is valid; scanning cloud-uploaded photos is a user-privacy nightmare, but an expected one.

End-to-end encryption (where only the user's device can decrypt and see the photo) would probably stand up legally, but then they wouldn't be able to use the cloud photos to make money.

The problem comes from how "illegal" gets recognised and the way that recognition is handled.
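The end-to-end point above can be sketched in a few lines. This is a toy (a hash-based keystream, NOT real cryptography; real systems use AES or similar, and all names here are illustrative), just to show the structural property: the client encrypts before upload, so the provider only ever stores bytes it can neither scan nor monetise, while the device can still recover the photo.

```python
# Toy end-to-end encryption sketch: derive a keystream from a device-local
# key and XOR it with the photo before upload. The cloud stores only the
# ciphertext; XORing again with the same keystream restores the photo.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Deterministic byte stream from key + nonce (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"device-local-key"   # never leaves the user's device
nonce = b"photo-0001"       # unique per photo
photo = b"raw photo bytes"

uploaded = xor(photo, keystream(key, nonce, len(photo)))   # what the cloud sees
restored = xor(uploaded, keystream(key, nonce, len(photo)))  # only the device can do this
assert restored == photo
```

Because the key never leaves the device, server-side scanning is impossible by construction, which is exactly why providers who want to scan (or mine the photos for ad revenue) either avoid E2EE for photo storage or move the scanning on-device.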