this post was submitted on 24 Nov 2025
26 points (96.4% liked)


Senior UK police officer says AI is accelerating violence against women and girls and that technology companies are complicit

top 9 comments
[–] p03locke@lemmy.dbzer0.com 8 points 3 weeks ago

The Guardian: We interviewed a police officer, from an organization most of the public doesn't trust, and he spouted off his opinion. Since he said the magic word "AI", we jumped all over it.

I'd like to know how this is actually "accelerating violence against women and girls". This is on the level of the "video games promote violence and create serial killers" panic statements of the '80s and '90s.

[–] HubertManne@piefed.social 5 points 3 weeks ago (1 children)

I can see this. Someone in another thread commented that if it's for private use it's fine, and I sorta get that, but someone else made the point that the AI shouldn't have the info in the first place. So people should really be concerned that AI can take image, video, or audio input from users. They should have to use text or non-image docs. If someone wants to deepfake an ex, they'd need to describe them.

[–] Ashtear@piefed.social 4 points 3 weeks ago (1 children)

Yes, if it were on a locally hosted generative model, I wouldn't be bothered if someone did this with my likeness. That wouldn't be meaningfully different from using Photoshop to fake it ten years ago.

Passing it around to their friends and gods know whom else is still just as reprehensible though.

[–] HubertManne@piefed.social 3 points 3 weeks ago (1 children)

Yeah, and Photoshopped stuff had the passing-around problem too. AI does have the issue of inputs being folded into other outputs, though. If someone asks for a redhead, will it reference the redheads that various randos have uploaded into the system? Honestly, what's scarier is if someone wants to touch up photos of their family and the AI treats those as human images that make sense to use as reference for other porn requests.

[–] Ashtear@piefed.social 2 points 3 weeks ago (1 children)

That's why I said local models. They aren't automatically taking what users feed them and running training updates on it.

And yeah, we've all already had our likenesses folded in somewhere. That's the bigger problem here.

[–] HubertManne@piefed.social 1 points 3 weeks ago

Ah, missed that. I think my brain lumped it in as part of private use rather than a private local model instance.

[–] sidebro@lemmy.zip 5 points 3 weeks ago (1 children)

I mean, the majority of deepfakes out there are so bad that I understand why some people don't care about them. But this stuff keeps getting better with time.

[–] Z3k3@lemmy.world 4 points 3 weeks ago

The bad stuff will never go away, so people will think they can spot it, while the really good stuff gets used for the dangerous stuff.

And yes, I do count myself as someone who will get it wrong. I just hope I can stave off that time for as long as possible.

[–] Asidonhopo@lemmy.world 2 points 3 weeks ago

I think the argument will eventually end up in the uncanny valley: obviously it's legal to draw a shitty little cartoon of someone engaged in sex, but as it gets closer to photorealism does it become illegal? Is photoshopping someone's head onto a porn star illegal? I mainly don't like laws like this because they bring politicians who are ignorant about AI into legislating it on the basis of it being about scandalous sex-related stuff, generally to sell more surveillance.