this post was submitted on 09 Mar 2026
200 points (98.1% liked)

World News

54580 readers
2697 users here now

A community for discussing events around the World




founded 2 years ago
[–] lmmarsano@group.lt 6 points 6 hours ago (2 children)

Explain your objection. It's a parenting problem, not everyone else's.

[–] nymnympseudonym@piefed.social 6 points 2 hours ago

My objection is that it's my operating system running on my computer.

Not yours. MINE.

I can make its logic gates do anything I want, as long as it's not sending CP or malware over the Internet.

[–] tyler@programming.dev 33 points 5 hours ago (2 children)

Parents already have the tools to block this at the network layer, including in mobile OSes. There's no need to add age verification to anything. Parents control their kids' devices, so don't give them a device they can access this stuff on.

These tools have existed for literal decades at this point. Anyone trying to add something now is just trying to make it easier for the government to spy on you.

[–] lmmarsano@group.lt 1 point 1 hour ago* (last edited 1 hour ago)

Cool: agreed. Your objection was ambiguous.

If we had to choose, though, I'd consider the professor's suggestion preferable to age verification. While I disagree with mandating it, a mandate would change little, because it's already reality: most mainstream OSes include parental controls. The "criteria" would establish standards for parental controls, which isn't altogether a bad idea. A better idea would be to promote a standard & replace mandates with public services that provide parental control technologies for free & educate parents.

In the late 90s, when the US Congress attempted to regulate minors' access to adult content, those laws commissioned studies that drew similar conclusions even then. The studies & federal courts concluded that, to meet the government's compelling interest in "protecting minors from harmful content", there were alternatives to criminalization & age verification that are more narrowly tailored, less restrictive of fundamental rights, & at least as effective:

  • client-side filters to block content from the receiving end
  • government programs to train parents & provide them resources to "protect" their children from "harmful content"
  • public education campaigns.

They pointed out that, while client-side filters may have false positives & negatives,

  • they can be monitored & corrected
  • they're a more complete solution that can restrict all internet protocols (not just web) from any geographic source (not only in legal jurisdiction) with content of any type (including dynamic such as live chat)
  • they allow restriction of other kinds of content (eg, violence, hate speech)
  • they can vary restrictions per child (eg, age-appropriateness)
  • they let parents disable them
  • they don't obstruct access by adults.
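The filter properties in that list (per-child settings, parental override, multiple content categories) can be sketched in a few lines. Everything below is a hypothetical illustration of the model the studies describe, not any real parental-control API:

```python
from dataclasses import dataclass, field

@dataclass
class ChildProfile:
    age: int
    blocked_categories: set[str] = field(default_factory=set)
    filter_enabled: bool = True  # parents can disable the filter entirely

def allowed(profile: ChildProfile, content_category: str) -> bool:
    """Client-side check: restrictions vary per child, never bind adults."""
    if not profile.filter_enabled:
        return True  # override: no obstruction once the filter is off
    return content_category not in profile.blocked_categories

# Restrictions tuned to age-appropriateness, per child:
teen = ChildProfile(age=15, blocked_categories={"adult", "gambling"})
child = ChildProfile(age=8,
                     blocked_categories={"adult", "gambling", "violence"})

assert allowed(teen, "violence")        # permitted for the older child
assert not allowed(child, "violence")   # blocked for the younger one
child.filter_enabled = False            # parent disables the filter
assert allowed(child, "adult")          # nothing is obstructed anymore
```

The false-positive/negative point maps onto `blocked_categories`: since the list lives on the parent's device, a misclassified site can be corrected locally without anyone else's involvement.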

Criminalizing access to adult content at the source obstructs everyone's access & burdens them with loss of privacy & with security risk.

Despite their age, those studies' findings remain relevant.

  • COPA Commission

    In October 1998 Congress enacted the Child Online Protection Act and established the Commission on Online Child Protection to study methods to help reduce access by minors to certain sexually explicit material, defined in the statute as harmful to minors. Congress directed the Commission to evaluate the accessibility, cost, and effectiveness of protective technologies and methods, as well as their possible effects on privacy, First Amendment values and law enforcement. This report responds to the Congressional request.

  • National Research Council

    In November 1998, the U.S. Congress mandated a study by the National Research Council (NRC) to address pornography on the Internet (Box P.1).

COPA Commission summary

The COPA Commission found age verification by ID to have the highest adverse impact on cost, privacy, fundamental rights, and law enforcement, and to score poorly on effectiveness and accessibility. It found other technologies & methods to be more effective & accessible with much lower adverse impact, including

  • client-side filtering
  • family education programs
  • acceptable use policies
  • top-level domains for materials "not harmful" to minors
  • "greenspaces" containing only child-appropriate materials.

Some recommendations to highlight:

Public Education:

  • Government and the private sector should undertake a major education campaign to promote public awareness of technologies and methods available to protect children online.
  • Government and industry should effectively promote acceptable use policies.

Consumer Empowerment Efforts:

  • Resources should be allocated for the independent evaluation of child protection technologies and to provide reports to the public about the capabilities of these technologies.
  • Industry should take steps to improve child protection mechanisms, and make them more accessible online.
  • A broad, national, private sector conversation should be encouraged on the development of next-generation systems for labeling, rating, and identifying content reflecting the convergence of old and new media.
  • Government should encourage the use of technology in efforts to make children's experience of the Internet safe and useful.

Industry Action:

  • The ISP industry should voluntarily undertake "best practices" to protect minors.
  • The online commercial adult industry should voluntarily take steps to restrict minors' ready access to adult content.

NRC summary

The NRC found "no single or simple answer". It agreed on the capability of filters to prevent inadvertent or weakly motivated exposure, but also stressed social & educational strategies to address motivation, coping, & responsible behavior. Social and educational strategies are intended to teach children how to make wise choices about how they behave on the Internet and to take control of their online experiences: where they go, what they see, what they do, who they talk to. Such strategies must be age-appropriate if they are to be effective. Further, such an approach entails teaching children to be critical, skeptical, and self-reflective about the material they are seeing.

An analogy is the relationship between swimming pools and children. Swimming pools can be dangerous for children. To protect them, one can install locks, put up fences, and deploy pool alarms. All of these measures are helpful, but by far the most important thing that one can do for one’s children is to teach them to swim.

Perhaps the most important social and educational strategy is responsible adult involvement and supervision.

Internet safety education is analogous to safety education in the physical world, and may include teaching children how sexual predators and hate group recruiters typically approach young people, how to recognize impending access to inappropriate sexually explicit material, and when it is risky to provide personal information online. Information and media literacy provide children with skills in recognizing when information is needed and how to locate, evaluate, and use it effectively, irrespective of the media in which it appears, and in critically evaluating the content inherent in media messages. A child with these skills is less likely to stumble across inappropriate material and more likely to be better able to put it into context if and when he or she does.

Education, supervision, & parental controls/filters seem the more compelling solution. However, bring that up in regard to legislation to age-restrict social media, and the tune on Lemmy changes dramatically: that seems inconsistent.

[–] IsoKiero@sopuli.xyz 3 points 3 hours ago

Sadly, a lot of parents lack the skills to use those tools, or don't even know they exist. I'm not inherently against an approach where the user agent sends some rough age (an allowed R-rating or something) to the website, which can then block minors from accessing porn/violence/whatever. If it were just that, locally stored info on whether the user is a minor or an adult, it could be a pretty decent way for even less technically inclined parents to set some limits on what their kids can do.

But as with nearly every 'protect the kids' thing, it's a pretty damn slippery and steep slope. The moment adult verification requires something more than a local variable, the whole system becomes a tool for surveillance instead of a helpful thing for parents/schools, and all of these "solutions" worldwide seem to be going in that direction.