this post was submitted on 29 Oct 2025
730 points (99.6% liked)

Not The Onion

18491 readers

Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!

founded 2 years ago
top 50 comments
[–] Stamau123@lemmy.world 20 points 1 day ago (1 children)

Hahaha, what a story, Mark! Anyway, how's your sex life?

[–] pyre@lemmy.world 2 points 5 hours ago

"define 'sex'. no, wait... define 'life'."

[–] Eh_I@lemmy.world 7 points 1 day ago (2 children)

Well, someone at facebook needs to come collect their jerk-off trophy.

[–] Bloefz@lemmy.world 1 points 5 hours ago

I think they already came

[–] phutatorius@lemmy.zip 5 points 14 hours ago

Zuck, you dirty wanker.

[–] ignotum@lemmy.world 60 points 1 day ago

"hey steve, did you download a shitton of porn while on the company network?"
"Uuuhhhhh, it's for ai training"

[–] RedFrank24@lemmy.world 59 points 1 day ago (3 children)

Torrent The Dark Knight to watch at home alone and the media companies will sue you for infinity billion dollars. Openly torrent every movie known to man to train an AI and the media companies don't do shit.

[–] manuallybreathing@lemmy.ml 8 points 1 day ago (3 children)

Steal 1 movie, you're a murderer, steal 1 million, you're a conqueror, steal 'em all? You're a goooooooood~

this comment is oc, DO NOT steal

[–] Akanes@startrek.website 18 points 1 day ago

Too late, but it's ok, I'll only keep it for personal use

[–] markz@suppo.fi 3 points 1 day ago* (last edited 1 day ago) (1 children)

You wouldn't steal a car.

You wouldn't right click and save an NFT.

Piracy. It's a crime.

[–] Knock_Knock_Lemmy_In@lemmy.world 2 points 8 hours ago (1 children)
[–] markz@suppo.fi 2 points 8 hours ago (2 children)

Ah, so the anti-pirate lobby still pirated the composer's music. Just not for the famous ad.

[–] Jarix@lemmy.world 2 points 5 hours ago

Much appreciated. Thank you

[–] Asafum@feddit.nl 96 points 1 day ago (3 children)

"So just to get this straight, you're saying you have downloaded 152 .... zettabytes .... of porn for your own personal use?"

[–] Part4@infosec.pub 8 points 1 day ago

Pirated copyrighted material for training AI is measured in Zuckerbytes, not zettabytes.

[–] FuglyDuck@lemmy.world 29 points 1 day ago (1 children)

"....Zetabaytes? no. it was yottabytes"

[–] Taleya@aussie.zone 8 points 1 day ago

rookie numbers for a bunch of 20-something programmers tbh

[–] Apeman42@lemmy.world 194 points 2 days ago (3 children)

Zuck's just an omega-level gooner, swearsies!

[–] markz@suppo.fi 91 points 2 days ago (1 children)

Why do you think Facebook exists?

[–] Mac@mander.xyz 22 points 2 days ago (1 children)

Basically its inception tbh

[–] phoenixz@lemmy.ca 34 points 1 day ago

An even more “glaring” defect, Meta argued, is that Meta’s terms prohibit generating adult content, “contradicting the premise that such materials might even be useful for Meta’s AI training.”

Oh yes, this is true because Meta, or any other company for that matter, has never ever in the history of ever changed its terms of service...

[–] MachineFab812@discuss.tchncs.de 91 points 2 days ago* (last edited 1 day ago) (6 children)

Where did they get the idea that that's a more respectable response?

EDIT: Doesn't/shouldn't work for their liability either. Vocabulary fail on my part.

[–] markz@suppo.fi 93 points 2 days ago (2 children)

Dignity doesn't matter if it wins the case.

Welcome to Zucc's Fucc & Succ

[–] Tm12@lemmy.ca 32 points 2 days ago

Zuck’s Fuck and Suck sounds like his product’s effects on society.

[–] BussyGyatt@feddit.org 6 points 1 day ago

yes, and Epstein trafficked all those children to himself.

[–] besselj@lemmy.ca 74 points 2 days ago (6 children)

Did anyone check on the Meta engineers in the goon cave?

[–] stevedice@sh.itjust.works 2 points 5 hours ago* (last edited 5 hours ago)

You're joking, but it was 2,400 movies over 7 years, downloaded individually and not in bulk like they blatantly did with books. Apparently over 60% of adults admit to having viewed porn at work, so yeah... someone should probably check on the engineers in the goon cave.

[–] NateNate60@lemmy.world 38 points 2 days ago* (last edited 2 days ago) (2 children)

I would not be surprised if Meta advertised such a thing to prospective employees as a legitimate benefit of the job. A built-in VR goon cave with 30 TB of material available. Limit 1 hour per person, bookings required 6 months in advance. Sessions subject to monitoring for security and training purposes. May contain trace amounts of Zuck.

[–] WhatGodIsMadeOf@feddit.org 50 points 2 days ago

Humanity is fucked letting these companies exist.

[–] phoenixz@lemmy.ca 24 points 1 day ago* (last edited 1 day ago) (2 children)

One wonders how much child porn was in there....

But it's AI, so itsa aaaaalllll fine

[–] manuallybreathing@lemmy.ml 5 points 1 day ago (2 children)

It's better to say child sexual abuse material (CSAM); the term "child porn" both legitimizes the content and implies children could ever be active and consenting participants.

Sexualised content involving children is abuse and should be labelled as such.

[–] BilSabab@lemmy.world 11 points 1 day ago (7 children)

Ok, but why would anyone bother training AI on porn? Seriously, I don't understand

[–] Evotech@lemmy.world 1 points 7 hours ago* (last edited 7 hours ago)

It's the best dataset for human anatomy.

[–] calcopiritus@lemmy.world 21 points 1 day ago (2 children)

Gen AI porn and shitposts are the only 2 decent use cases I've seen of gen AI.

You can't make half of those without training it on porn.

[–] BanMe@lemmy.world 3 points 1 day ago

Just to add I've seen a full 10 second clip of AI gay porn and it was very, very realistic, with only some minor things that set it off as fake. It's gotten to the point where I'm afraid I'm cumming to clankers now.

[–] moondoggie@lemmy.world 19 points 1 day ago (1 children)

An article I read when this came up a couple of months ago said that basically porn was the best way to show AI unclothed human movement. Watching clothed humans move, you can get the basics but can’t see how the muscles are working. If you show it a Hollywood movie, they might see occasional shirtless scenes or artfully lit and blocked out sex scenes. Porn has the greatest amount of naked people moving their bodies.

[–] mangaskahn@lemmy.world 10 points 1 day ago (1 children)

Video generation, copyright matching, CSAM detection, those are just the first few that pop into my head.

[–] falseWhite@lemmy.world 8 points 1 day ago (7 children)

Have you not heard that everyone is now doing NSFW chatbots? Meta is just trying to catch up with Grok and ChatGPT.

[–] vane@lemmy.world 21 points 1 day ago

that's a mafia-level response

[–] Lodespawn@aussie.zone 32 points 2 days ago

I can only assume the MPAA will funnel vast sums of money into helping prosecute these thieves? Any minute now right?
