this post was submitted on 11 Dec 2025
581 points (96.5% liked)

Technology

[–] Devial@discuss.online 186 points 3 days ago* (last edited 3 days ago) (44 children)

The article headline is wildly misleading, bordering on being just a straight up lie.

Google didn't ban the developer for reporting the material, they didn't even know he reported it, because he did so anonymously, and to a child protection org, not Google.

Google's automated tools correctly flagged the CSAM when he unzipped the data, and then nuked his account.

Google's only failure here was to not unban on his first or second appeal. And whilst that is absolutely a big failure on Google's part, I find it very understandable that the appeals team generally speaking won't accept "I didn't know the folder I uploaded contained CSAM" as a valid ban appeal reason.

It's also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than out of the fact that hundreds of CSAM images were freely available online and openly shareable by anyone, and to anyone, for god knows how long.

[–] forkDestroyer@infosec.pub 23 points 2 days ago (5 children)

I'm being a bit extra but...

Your statement:

The article headline is wildly misleading, bordering on being just a straight up lie.

The article headline:

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It

The general story in reference to the headline:

  • He found CSAM in a known AI dataset, which he had stored in his account.
  • Google banned him for having this data in his account.
  • The article mentions that he tripped the automated monitoring tools.

The article headline is accurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "CSAM").

The article headline is inaccurate if you interpret it as

"A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It" ("it" being "reporting CSAM").

I read it as the former, because the action of reporting isn't listed in the headline at all.

^___^

[–] WildPalmTree@lemmy.world 1 points 1 day ago

My interpretation would be that the inclusion of "found" signals that it was important to the action Google took.
