this post was submitted on 02 May 2026
1385 points (99.4% liked)
Technology
If you’re comfortable using a NAS, that is probably the best option, but Proton also offers end-to-end encrypted drive storage. For photos, Ente Photos is a good option.
Google’s algorithms go through every file you upload to their servers and check it for anything that might violate their terms of service. You could encrypt your files yourself before uploading, but that’s a lot of work. If their algorithm flags even one file as a violation, they may lock you out of your account and all of your data. Their appeal process is useless: appeals are likely checked by the same algorithm that closed the account in the first place, or rubber-stamped by a person churning through thousands of reports a day. Most appeals are rejected, and lifetimes of data and memories get deleted like it’s nothing. Of course backups are recommended. These AI systems were rolled out too soon and should never act as judge, jury, and executioner over people’s data.
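On the encrypt-before-uploading point, it’s less work than it sounds if you’re OK with the command line. A rough sketch using openssl (assuming it’s installed) — the directory name and passphrase are placeholders, and a real setup should pull the passphrase from a key file or password manager instead of putting it on the command line:

```shell
# Bundle and encrypt locally so the provider only ever stores ciphertext.
# "photos/" and the passphrase are stand-ins for illustration only.
mkdir -p photos && echo "example image data" > photos/img001.jpg
tar -czf photos.tar.gz photos/

# Encrypt with a passphrase-derived key (PBKDF2)
openssl enc -aes-256-cbc -pbkdf2 -iter 200000 -salt \
  -in photos.tar.gz -out photos.tar.gz.enc -pass pass:change-me

# Upload photos.tar.gz.enc; recover later by decrypting with the same passphrase:
openssl enc -d -aes-256-cbc -pbkdf2 -iter 200000 \
  -in photos.tar.gz.enc -out restored.tar.gz -pass pass:change-me
```

Tools like gpg, age, or rclone’s crypt remote do the same job with less ceremony; the point is just that only the `.enc` file ever leaves your machine.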
Google flags cartoon images and family photos as well. Forbes reported on it (https://www.forbes.com/sites/thomasbrewster/2021/12/20/google-scans-gmail-and-drive-for-cartoons-of-child-sexual-abuse/). They also closed a fairly high-profile YouTube channel’s account over cartoons (https://en.wikipedia.org/wiki/Naoki_Saito). The same goes for family photos and medical photos, with plenty of reports from various news outlets; the most prominent were probably the three from the NYT (https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html, https://www.nytimes.com/2022/12/30/technology/google-appeals-change.html, and https://www.nytimes.com/2023/11/27/technology/google-youtube-abuse-mistake.html). Plus more from El País (https://english.elpais.com/science-tech/2022-09-19/google-closed-my-account-over-sexual-content-but-theyre-not-telling-me-what-it-is-and-ive-lost-everything.html), Business Insider (https://www.businessinsider.com/google-users-locked-out-after-years-2020-10?op=1), Android Police (https://www.androidpolice.com/2021/03/08/when-google-locks-you-out-of-your-account-begging-the-internet-for-help-is-your-first-and-last-resort/), the India Times, etc. And tons of self-reporting (https://piunikaweb.com/2026/02/03/google-photos-false-csam-flags-users-locked-out/).
My point is that it is not black and white, or as simple as “don’t download it.” There are plenty of cases in which a person would not know, such as downloading an AI training set (https://www.404media.co/a-developer-accidentally-found-csam-in-ai-data-google-banned-him-for-it/). If they truly wanted to follow the law, only knowing possession would end with a person’s account being terminated; in all other cases, the file should at most be reported and deleted. But their system is deeply flawed, and most appeals are denied — which is nonsense when less than 1% of these reports end in an arrest and even fewer lead to convictions (https://stacks.stanford.edu/file/druid:pr592kc5483/cybertipline-paper-2024-04-22.pdf).
To be honest, I don’t think this is all a failure of Google, Meta, or Microsoft, but also of NCMEC and Thorn. They are the real threat to child safety: they use their platforms to claim they want to save children while pursuing other agendas (https://www.techdirt.com/2024/08/08/the-many-reasons-why-ncmecs-board-is-failing-its-mission-from-a-ncmec-insider/ and https://www.jezebel.com/ashton-kutcher-thorn-sex-workers-1850852760). Thorn, at least, has been found to misrepresent the number of children it has rescued (https://www.snopes.com/fact-check/kutcher-software-child-trafficking/).
All Google and the others are doing is over-reporting, which makes it harder to find actual criminals. It is hardly worth celebrating when one is caught while thousands of innocent people are being harmed. There need to be penalties for false reports, or at least a way for people to reclaim their data and accounts once cleared of wrongdoing. The number of false positives is absurd; researchers at Facebook and LinkedIn have both found these systems to be highly error-prone (https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse?language=en).
I think we desperately need data privacy and data protection laws. The “think of the children” and “I have nothing to hide” arguments against them are just trickle-down ideas from data brokers who profit heavily from harvesting personal data.