this post was submitted on 13 Mar 2026

Europe on Friday took the first step towards outlawing artificial intelligence practices that generate child sexual abuse material, after EU governments proposed adding this provision to the bloc's landmark AI rules adopted two years ago.

Grail@multiverse.soulism.net 2 points 1 month ago

https://www.sciencedirect.com/science/article/pii/S0145213424003673

In two anonymous surveys of CSAM consumers in the community, most (50–64 %) reported that they first viewed CSAM by accident, often while searching for other material online (Insoll et al., 2021; Napier et al., Forthcoming). The present study aims to build on this research by examining whether (a) accidental first exposure to CSAM can lead to subsequent intentional viewing and how often this occurs, and (b) first time intentional CSAM viewers are more likely to continue to view intentionally.

Respondents who intentionally searched for CSAM at first exposure (versus those who said they discovered it by accident) had 2.5 times the odds of viewing CSAM intentionally after first exposure (see Table 2). However, a substantial proportion of accidental first-time CSAM viewers went on to view CSAM intentionally (44.0 %, 167). Specifically, 144 of 284 males (50.7 %) and 15 of 76 females (19.7 %) who first discovered CSAM by accident said they then went on to view it intentionally. Hence, clarifying research question 3, that accidental first-time discovery of CSAM does lead to subsequent intentional viewing in some individuals.

Accidental exposure to CSAM led to subsequent intentional viewing for a sizeable proportion of respondents. While all genders are exposed to CSAM, males are more likely to be exposed and intentionally view it again after first exposure. Nevertheless, a quarter of female CSAM viewers also viewed CSAM intentionally after first exposure. Intervention initiatives that aim to prevent onset and escalation of CSAM consumption in the community should target all genders and consider the predictors identified in this study.

There's your study. You mentioned video games causing violence. Video games may not cause violence, but video games sure cause video games: people who try out a game and like it go on to become gamers. Likewise, people who see child porn and like it go on to become CSAM offenders. Once a person becomes a regular CSAM consumer, there's a higher chance they'll go to the dark web and pay someone for this content. And at that point, we can see genuine harm.