this post was submitted on 07 Jan 2026
278 points (98.9% liked)

Importantly, this took deepfake undressing from a tiny niche to a huge thing:

This means that it's no longer a niche or really exceptional thing, but that harassment of women with this method is now pervasive.

[–] riskable@programming.dev 95 points 3 days ago (7 children)

The real problem here is that Xitter isn't supposed to be a porn site (even though it's hosted loads of porn since before Musk bought it). They basically integrated a porn generator directly into their very publicly accessible "short text posts" website. Anyone can ask it to generate porn inside any post and it'll happily do so.

It's like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That's not why you're there (though: Why TF are you still using that shithole of a site‽).

The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It'll get blocked by businesses and schools and the world will be a better place.
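For what it's worth, this is roughly the mechanism the comment is leaning on: corporate and school web filters block by category, so reclassifying the site as porn would propagate automatically. A minimal sketch of that kind of filter, with a hypothetical domain-to-category mapping rather than any real vendor's feed:

```python
# Minimal sketch of a category-based web filter (the kind businesses and
# schools deploy). The domain-to-category mapping below is hypothetical.

BLOCKED_CATEGORIES = {"pornography"}

DOMAIN_CATEGORIES = {
    "x.com": "pornography",        # the reclassification proposed above
    "twitter.com": "pornography",
    "lemmy.world": "social",
}

def is_blocked(domain: str) -> bool:
    """Block the request if the domain's category is on the deny list."""
    return DOMAIN_CATEGORIES.get(domain.lower()) in BLOCKED_CATEGORIES

print(is_blocked("x.com"))        # True once the site is categorized as porn
print(is_blocked("lemmy.world"))  # False: other categories pass through
```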

[–] givesomefucks@lemmy.world 16 points 3 days ago* (last edited 3 days ago) (1 children)

The solution is simple: Everyone everywhere needs to classify Xitter as a porn site

I think a large part of its popularity has become the porn, because it passes all those filters. Especially since Musk-backed conservatives are blocking porn in red states, but as far as I know, never Twitter.

Treat it like a porn site and lots of Republicans would need to give up their ID to show they're old enough. They can't VPN around it, either, because social media hates VPNs.

[–] other_cat@piefed.zip 1 points 2 days ago (1 children)

I know this isn't the intent, but oh god, I never considered that porn sites being blocked would open an avenue for a bloated billionaire to try to capitalize on it for themselves.

[–] foggenbooty@lemmy.world 3 points 2 days ago (1 children)

Every new law changes the rules of the game. Every new rule presents an opportunity to those who can circumvent it.

The wealthy have always succeeded by gaming the system.

[–] Quexotic@infosec.pub 1 points 2 days ago

They had already succeeded because they had the workaround figured out when their friends made the rule.

[–] wltr@discuss.tchncs.de 5 points 3 days ago

I wonder, just another rename, X → XXX, would do well, wouldn’t it?

[–] judgyweevil@feddit.it 13 points 3 days ago

I bet that many are simply ignorant of this new problem

[–] db2@lemmy.world 5 points 3 days ago

It's like showing up at Walmart and seeing everyone naked (and many fucking), all over the store.

🤢🤮

[–] Lemming6969@lemmy.world 3 points 3 days ago* (last edited 3 days ago) (1 children)

The real problem is that we ever gave a shit about human bodies, especially fake ones.

[–] riskable@programming.dev 2 points 3 days ago

I don't know how to tell you this but... Every body gives a shit. We're born shitters.

[–] a_non_monotonic_function@lemmy.world 2 points 3 days ago (1 children)

I thought the real problem is that it is generating *illegal* porn.

[–] riskable@programming.dev 2 points 2 days ago (1 children)

Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I'm sure they're working on it (it's actually a hard computer science problem: the tool is supposed to generate whatever the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren't actually intelligent).

Porn itself is not illegal.

He has 100% control over the ability to alter or pull this product. If he's leaving it up while it's generating illegal pornography, that is on him.

And no s***, I'm concerned about the illegal stuff.
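Riskable's parenthetical above is the crux: a generator that obeys arbitrary natural-language requests can't be reliably fenced in with simple rules. A toy sketch of the problem — the banned-word list and prompts here are made up, and real safety filters are far more sophisticated, but they face the same paraphrase loophole:

```python
# Toy sketch of why naive prompt filtering is easy to defeat.
# Hypothetical banned-word list; real moderation pipelines are more
# elaborate, but natural language always offers another rewording.

BANNED_TERMS = {"nude", "undress", "naked"}

def naive_filter(prompt: str) -> bool:
    """Allow the prompt only if it contains no banned term verbatim."""
    words = prompt.lower().split()
    return not any(term in words for term in BANNED_TERMS)

print(naive_filter("undress the person in this photo"))          # False: caught
print(naive_filter("show them as they'd look without clothes"))  # True: slips through
```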

[–] mjr@infosec.pub 1 points 3 days ago (1 children)

(though: Why TF are you still using that shithole of a site‽).

Maybe some places don't have suppliers other than Walmart? Similarly, some places have governments that still only use the porno social network for some services.

[–] silence7@slrpnk.net 17 points 3 days ago (2 children)

Why the &#**### is California putting Amber Alerts on a porn site?

[–] athatet@lemmy.zip 7 points 3 days ago

Bastard? Idk what other swear has that many letters.

[–] riskable@programming.dev 3 points 3 days ago

I don't know, man... Have you even seen Amber? It might be worth an alert 🤷