Won't work, and if it does work, the resulting image has little to nothing to do with the original.
Source: I've opened a badly taken .raw file a few thousand times and I know what focal length means, come at me.
I am so glad I no longer interact with that dumpster fire of a social network. It's like the Elon takeover and the monetization program brought every weirdo in the world out of the woodwork.
Sounds about right for X users.
> late sex offender Jeffrey Epstein
I'm so done with all the whitewashing. "Sex offender" sounds like someone who merely misbehaved in otherwise consensual sex. What this prick was is a pedophile. A child rapist. A kid abuser. But certainly no "late financier" or whatever else the media chose over the facts.
Also a slaver and child abductor.
How do these AI models generate nude imagery of children without having been trained with data containing illegal images of nude children?
The datasets they're trained on do in fact include CSAM. These datasets are so huge that it easily slips through the cracks. It's usually removed whenever it's found, but I don't know how that actually affects models that have already been trained on it; to my knowledge, it's not possible to selectively "untrain" a model, so it would need to be retrained from scratch. Plus, it occasionally crops up in the news that new CSAM has been found in the training data.
It's one of the many, many problems with generative AI.
Can't ask them to sort that out. Are you anti-AI? That's a crime! /s
Easy answer is: they don't.
Though that's just the one admitting to it.
A slightly more nuanced answer is: it probably depends. There's likely some inference being made between age ranges, but my guess is it'd be sub-par, given that these models sometimes struggle to reproduce images they have a tonne of actual data for.
Are these people fucking stupid? AI can't remove something baked into the image. The only way for it to "remove" it is to paint a different image over it, and since it has no idea what's underneath, it would literally just be making up a new image that has nothing to do with the content of the original. Jfc, people are morons. I'm disappointed the article doesn't explicitly state that, either.
They think that the AI is smart enough to deduce from the pixels around it what the original face must have looked like, even though there's actually no reason why there should be a strict causal relationship between those things.
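For anyone wondering what an AI "unblur" actually does mechanically: it's inpainting, i.e. synthesizing new pixels conditioned on a prompt and the surrounding image. Here's a minimal sketch with Hugging Face diffusers, assuming a public inpainting checkpoint and made-up file names (nothing to do with whatever Grok runs). The tell is that different random seeds produce different, equally confident faces, which wouldn't happen if anything were being recovered.

```python
# Sketch only: the model ID is one public example checkpoint and the
# file names are placeholders. Needs a GPU and
# `pip install diffusers torch pillow`.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
).to("cuda")

image = Image.open("photo.png").convert("RGB").resize((512, 512))
# White pixels mark the blurred region to be "reconstructed":
mask = Image.open("mask.png").convert("RGB").resize((512, 512))

# Two seeds -> two different invented faces for the same input.
for seed in (0, 1):
    gen = torch.Generator("cuda").manual_seed(seed)
    out = pipe(prompt="a person's face", image=image,
               mask_image=mask, generator=gen).images[0]
    out.save(f"invented_face_{seed}.png")
```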
The black boxes would be impossible, but there are some types of blur that keep enough of the original data that they can be undone. There was a pedophile who used a swirl to cover his face in pictures, and investigators were able to unswirl the images and identify him.
With how the rest of this has gone, it wouldn't surprise me if someone was incompetent enough to use a reversible one, although I have my doubts Grok would undo it properly.
Edit: this technique only works for video, but maybe if there are several pictures of the same person, all blurred, it could be used there too?
Yeah, but that type of machine learning and the diffusion models used in image genAI are almost completely disjoint.
Agree with you there. Just pointing out that in theory and with the right technique, some blurring methods can be undone. Grok most certainly is the wrong tool for the job.
Several years ago, authorities were searching the world for a guy who had been traveling around, molesting children, photographing the abuse, and distributing the photos on the Internet. He was often in the photos, but he had chosen to use some sort of swirl blur on his face to hide it. The authorities just "unswirled" it, and there was his face, in all those photos of abused children.
They caught him soon after.
A swirl is a distortion that is non-destructive. An anonymity blur averages out pixels over a wide area in a repetitive manner, which destroys information. Would it be possible to reverse? Maybe a little bit, maybe a small fraction of the pixels, but there wouldn't be any way to prove the accuracy of those pixels, and there would be massive gaps in the information.
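To make "destroys information" concrete, here's a toy numpy sketch using block averaging (pixelation) as a stand-in for an anonymity blur: two visibly different images can produce the exact same blurred output, so nothing, AI included, can tell you which original was behind it.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4  # blur block size

def pixelate(img, k=K):
    """Average each non-overlapping k x k block -- a crude anonymity blur."""
    h, w = img.shape
    return img.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

a = rng.random((8, 8))               # "original" image
b = a + rng.normal(0, 0.1, a.shape)  # a different image...
for i in range(0, 8, K):             # ...nudged so each block's mean
    for j in range(0, 8, K):         #    matches the matching block in `a`
        b[i:i+K, j:j+K] += a[i:i+K, j:j+K].mean() - b[i:i+K, j:j+K].mean()

print(np.allclose(pixelate(a), pixelate(b)))  # True:  identical blurs
print(np.allclose(a, b))                      # False: different originals
```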
Swirl is destructive, like almost everything in raster graphics once recompression is involved, but unswirling it makes a good approximation at somewhat reduced quality. If the program or the effect's code is known, e.g. they did it in Photoshop, you just drag the slider to the opposite side. Come to think of it, it could make a nice puzzle in an adventure game, or another kind of captcha.
You're right. By "non-destructive" I meant more that it's reversible, depending on factors like intensity and whether the algorithm is known.
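The "drag the slider to the opposite side" trick is easy to demonstrate when the parameters are known. A sketch with scikit-image, with parameter values invented for the demo: a swirl rotates each pixel around the center by an angle that depends only on its distance from the center, so that distance is preserved, and negating the strength gives the exact inverse mapping. The only loss is interpolation during resampling.

```python
import numpy as np
from skimage import data
from skimage.transform import swirl
from skimage.util import img_as_float

original = img_as_float(data.camera())  # stock test image, not a case photo

# "Anonymize" with a swirl; parameters picked arbitrarily for the demo:
swirled = swirl(original, strength=10, radius=180)

# Undo it: same swirl, opposite direction.
recovered = swirl(swirled, strength=-10, radius=180)

print(np.abs(recovered - original).mean())  # tiny residual from resampling
```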
There was someone who reported that, due to the incompetence of White House staffers, some of the Epstein files had simply been "redacted" in MS Word by highlighting the text in black. People were actually able to remove the redactions by converting the PDF back into Word and removing the black highlighting to reveal the text.
Who knows if some of the photos have the same issue.
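That failure mode is trivially easy to reproduce. A sketch with PyMuPDF, with the file name made up for the demo: painting a black box over text changes how the page renders, but the text objects remain in the PDF's content stream, so any extractor, or a PDF-to-Word conversion, hands them right back.

```python
import fitz  # PyMuPDF: pip install pymupdf

# Build a "redacted" PDF the wrong way: text plus a black box on top.
doc = fitz.open()
page = doc.new_page()
page.insert_text((72, 100), "TOP SECRET: the redacted name")
page.draw_rect(fitz.Rect(60, 85, 360, 110), color=(0, 0, 0), fill=(0, 0, 0))
doc.save("fake_redaction.pdf")

# The text is still right there for anyone who extracts it:
print(fitz.open("fake_redaction.pdf")[0].get_text())
# -> TOP SECRET: the redacted name

# A real redaction has to delete the content itself, e.g. PyMuPDF's
# add_redact_annot() followed by apply_redactions().
```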
Hey! Cut it out! If those people could read, they'd be very upset!
> unblur the face with 1000% accuracy
They have no idea how these models work :D

biblically accurate cw casting
CW? The TV show?
Barrett O'Brien
It’s the same energy as “don’t hallucinate and just say if you don’t know the answer”
and don't forget "make no mistakes" :D
Though it is 2026. Who's to say Elon didn't feed the unredacted files into Grok while out of his face on ket 🙃
It feels like being back on the playground
"nuh uh, my laser is 1000% more powerful"
"oh yea, mine is ~~googleplex~~ googolplex percent more powerful"
Or percentages
I doubt any of these people are accessing X over Tor. Their accounts and IPs are known.
In a sane world, they'd be prosecuted.
In MAGAMERICA, they are protected by the Spirit of Epstein
What crime do you imagine they would be committing?
I don't know what they hope to gain by seeing the kid's face, unless they think they can match it up with an Epstein family member or something (seems unlikely to be their goal).