[–] abbiistabbii@lemmy.blahaj.zone 40 points 1 day ago* (last edited 1 day ago) (3 children)

I mean, Grok is literally being used to make CSAM, so it should not only be taken down, but the law should get involved in any reasonable country.

[–] QuandaleDingle@lemmy.world 2 points 3 hours ago* (last edited 3 hours ago)

It's just crazy, dude. Give pedos an opportunity, and they WILL take advantage of it. You could be making an image-hosting app for frickin octogenarians and they'll somehow discover it and use it as their spank bank. I hate that shit.

[–] bearboiblake@pawb.social 5 points 17 hours ago

how many times do we have to say it

the law is optional for corporations and the rich

the law is for oppressing the poor and justifying it

[–] Bunbury@feddit.nl 1 points 17 hours ago* (last edited 17 hours ago) (1 children)

The thing is though… is it actually CSAM according to the law if there is no real child involved along the way? At least some countries might need a law change first before tackling this.

Edit: I checked the local law where I’m located and it’s “images or videos depicting a minor during a sexual act”. I’d argue these AI-generated images aren’t of minors, because the children they depict don’t exist. We probably need new laws first, unless it’s provable that the images represent a specific real-life minor.

[–] abbiistabbii@lemmy.blahaj.zone 4 points 16 hours ago (2 children)

The thing is though… is it actually CSAM according to the law if there is no real child involved along the way?

Legally, yes. The law in the UK also covers images of children that are drawn or computer-generated.

[–] k0e3@lemmy.ca 3 points 4 hours ago (1 children)

I had kind of assumed this was the case in a lot of countries, with one of the more prominent exceptions being Japan (we have a lot of fucked-up manga/anime).

[–] abbiistabbii@lemmy.blahaj.zone 3 points 3 hours ago

Japan had legal child porn until disconcertingly recently.

[–] Bunbury@feddit.nl 2 points 16 hours ago

Good. I also read a bit more into it, and while I’m not located in the UK, apparently there is a specific extension to the law that also covers drawings and “collage”-type edits. It doesn’t mention AI yet, but surely this should count.

RIP my search history. But I’m glad we don’t need to wait for the law to catch up (in some places). Now all we have to do is scream from the rooftops that the laws need to be applied. Or even better: sue someone.