this post was submitted on 24 Mar 2026
351 points (99.2% liked)


A New Mexico jury ruled Tuesday that Meta knowingly harmed children's mental health and concealed what it knew about child sexual exploitation on its social media platforms.

The landmark decision comes after a nearly seven-week trial, and as jurors in a federal court in California have been sequestered in deliberations for more than a week about whether Meta and YouTube should be liable in a similar case.

Jurors sided with state prosecutors who argued that Meta — which owns Instagram, Facebook and WhatsApp — prioritized profits over safety. The jury determined Meta violated parts of the state's Unfair Practices Act on accusations the company hid what it knew about the dangers of child sexual exploitation on its platforms and the impacts on child mental health.

The jury agreed with allegations that Meta made false or misleading statements, and also agreed that Meta engaged in "unconscionable" trade practices that unfairly took advantage of the vulnerabilities and inexperience of children.

Jurors found there were thousands of violations, each counting separately toward a penalty of $375 million.


I see where you're coming from. It's not really illogical though — it logically follows that if one did the same thing as the other but worse (use AI to seduce kids vs use AI to generate CSAM), then if the weaker one is found illegal, the stronger one should be found more illegal.

It might be a non sequitur though ("it does not follow") because, as you say, Twitter wasn't involved in that case.

Still worth mentioning.

My local news is full of people who were found to be looking at CSAM and the years of prison time they're getting. Basically they were found to have naked pictures of kids, or pictures/video of people abusing kids, on their phone or computer. But they haven't actually harmed any real kids, IRL. Meanwhile, Western politicians are actually doing the thing and not being punished. I don't know if Trump has any CSAM on his phone (probably not), but he's gone to a private island to abuse minors, and so have a bunch of other people. One in the UK got stripped of his titles, but still lives a life of luxury. I wanna say "make it make sense," but it seems to me that it's not doing the thing that is punishable by prison, it's having the proof someone else did the thing. And that's sick. But that's the world we live in.