this post was submitted on 23 Mar 2026
156 points (98.1% liked)

Technology

[–] obvs@lemmy.world 16 points 1 day ago (2 children)

Unfortunately, the negative effects of companies like Google turning in completely ethical people for doing things that should be completely legal and uncontroversial will do drastically more damage than the positive effects of those same companies turning in the poorest of the pedophiles.

[–] Zamboni_Driver@lemmy.ca -4 points 1 day ago (1 children)
[–] obvs@lemmy.world 5 points 1 day ago (1 children)

The company is literally building death camps, installing statues of genociders, is run by the RICH pedophiles (who have ZERO interest in seeing pedophiles prosecuted), and is using Palantir and Flock cameras to monitor everything, meanwhile having secret police disappear people and just openly slaughter them.

The United States Government is well beyond deserving the benefit of the doubt.

[–] Zamboni_Driver@lemmy.ca -2 points 1 day ago (2 children)

Great, do you have a single example of what you're claiming? lol. Google turning in a perfectly ethical person for doing something that should be legal and uncontroversial.

You're moving the goal posts and changing your argument.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (2 children)

Try reading the thread

https://www.cbc.ca/lite/story/9.7115031

This was posted 9 hours before your whinge

[–] Zamboni_Driver@lemmy.ca 0 points 1 day ago (1 children)

I am so confused. Did you read the article that you posted???? Are you just straight up defending pedophilia and rape?

The Toronto detective alleges that after the alerts were passed to the RCMP and then Toronto police, she looked at three of the images and found they depicted naked prepubescent girls. The images included an explicit sex act and exposed genitals.

"depicted who I believe to be David Edward-Ooi Poon without a shirt, taking a selfie of himself while sticking out his tongue over an unconscious adult female," the search-warrant application states. The document goes on to describe the woman in the photo as naked below the waist and wearing a dark-coloured eye mask over her eyes. The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled "Girls I Drugged And Raped."

The images included adult females with breasts and genitals exposed "who appeared to be unconscious," the ITO says. "The body positioning of the females appeared to be limp and did not significantly change throughout the images taken." Police allege they found other files on the iPhone that appeared to be "upskirt" images or photographs focusing on the buttocks of females, in folders with names suggesting they were underage girls.

Detectives laid 41 more charges in December including making and possessing child pornography, sexual assault, voyeurism for a sexual purpose and drugging someone to facilitate sexual assault.

Either you can't read, or you are an incredibly disgusting person.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

Nah I copied the wrong link. There's one about a Swedish dude, but go ahead. Did you not notice you were reading the same article as the thread is about or did you skip reading it the first time?

[–] Zamboni_Driver@lemmy.ca 0 points 20 hours ago (1 children)

Lol yes I did notice.

"The wrong link"

"There's one about a Swedish dude"

The gymnastics you're going through to avoid actual facts is hilarious.

[–] HeyThisIsntTheYMCA@lemmy.world 1 points 19 hours ago (1 children)

so you skipped reading it both times, huh

[–] Zamboni_Driver@lemmy.ca 0 points 17 hours ago (1 children)

I think that you've lost the plot. That reply doesn't even make sense in the context of this conversation.

Yes, I read the article that you linked, the same one that's in the OP, in which you believe that someone who drugs and rapes people is an ethical person acting legally.

Could you please fly to a different country, retake basic kindergarten to grade 12 education, and then rejoin this conversation once you've acquired the skills to make logical and informed arguments. Thanks.

[–] HeyThisIsntTheYMCA@lemmy.world 0 points 17 hours ago* (last edited 17 hours ago) (1 children)

You're seeing red because the conversation is about a child abuser. You cannot think straight. Someone posts a link and, oopsie poopsie, they copy the one from the top of the conversation instead of the bottom (you haven't even bothered to read either link, not even while copying and pasting every line), and now you want to deport them. That is literally all that has happened. I haven't said shit. You are a fucking Nazi.

[–] Zamboni_Driver@lemmy.ca 1 points 11 hours ago

You posted one link. lmao, thinking I'm seeing red, yet you start calling people Nazis because you don't know how to use a computer.

I haven’t said shit. You are a fucking Nazi.

The pinnacle of intelligent discourse.

[–] Senal@programming.dev 0 points 1 day ago (1 children)
[–] Zamboni_Driver@lemmy.ca 1 points 20 hours ago (1 children)

Lol

"Do your own research"

Ok, Karen, sure. It's up to me to prove other people's random claims that they make on social media? Um, no.

[–] Senal@programming.dev 1 points 20 hours ago* (last edited 20 hours ago) (1 children)
[–] Zamboni_Driver@lemmy.ca 1 points 20 hours ago (1 children)

Yea I have zero issue with the fact that accounts with pictures of children's genitals on them should be referred to the authorities.

If people want privacy, host the pictures locally.

When you're storing images with a cloud provider, they become responsible for the images that they store. If it's a photo of a child's genitals, that's illegal for them to have on their servers, and they need to protect themselves.

[–] Senal@programming.dev 1 points 19 hours ago

Ah, this is probably my fault.

I'm not the person you were replying to, so I wasn't really arguing any of these points; I just saw the request and knew of an example, so I provided it.

Just in case this was for me specifically I’ll answer:

Yea I have zero issue with the fact that accounts with pictures of children’s genitals on them should be referred to the authorities.

Pictures of children’s genitals aren’t inherently CSAM, there are plenty of parents and family members with entirely innocent pictures of their kids on their phones.

There are examples of this in the reported cases of false positives leading to bad outcomes; this is easily searchable.

I'm not saying to do nothing; I'm saying blanket reporting is an ineffective brute-force approach.

If people want privacy, host the pictures locally.

In theory yes, in practice, not so much.

On-device scanning exists and is, or has been, in use on phones; examples of this are also easily searchable.
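For what it's worth, the scanning being argued about here is usually automated hash matching against a database of known material, not a human looking at every upload. A rough sketch of the idea, using a hypothetical hash list and plain SHA-256 as a stand-in (real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, which is exactly where false positives come from):

```python
import hashlib

# Hypothetical list of known-bad hashes (this one is just the
# SHA-256 of b"foo" for demonstration). Real deployments match
# against perceptual hashes supplied by organizations like NCMEC,
# not exact cryptographic digests.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the upload's digest matches a known entry."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

With exact hashing, any one-bit change defeats the match; perceptual hashing trades that brittleness for a tolerance window, which is also what makes innocent photos occasionally match.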

When you’re storing images with a cloud provider, they become responsible for the images that they store. If it’s a photo of a child’s genitals, that’s illegal for them to have on their servers, and they need to protect themselves.

The need for legal protection is valid; scanning cloud-uploaded photos is a user-privacy nightmare, but an expected one.

End-to-end encryption (where only the user's device can decrypt and see the photo) would probably stand up legally, but then they wouldn't be able to use the cloud photos to make money.
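To sketch why E2EE takes the provider out of the loop entirely: if the key never leaves the device, the server only ever holds ciphertext it can neither view, scan, nor monetize. A toy illustration only (a one-time-pad XOR, not something to ship; real E2EE uses authenticated ciphers such as AES-GCM with per-file keys):

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR cipher, for illustration only.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

photo = b"raw image bytes"
key = secrets.token_bytes(len(photo))  # key never leaves the device
ciphertext = encrypt(key, photo)       # only this is uploaded

# The provider stores opaque bytes; only the device, holding the
# key, can recover the original photo.
assert decrypt(key, ciphertext) == photo
```

This is the trade-off in miniature: the same property that protects the user's privacy also prevents any server-side scanning, which is why providers that scan generally don't offer true E2EE for photos.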

The problem comes with how "illegal" gets recognized and the way it's handled.