this post was submitted on 09 Dec 2025
549 points (99.1% liked)

World News


Australia has enacted a world-first ban on social media for users aged under 16, causing millions of children and teenagers to lose access to their accounts.

Facebook, Instagram, Threads, X, YouTube, Snapchat, Reddit, Kick, Twitch and TikTok are expected to have taken steps from Wednesday to remove accounts held by users under 16 years of age in Australia, and prevent those teens from registering new accounts.

Platforms that do not comply risk fines of up to $49.5m.

There have been some teething problems with the ban's implementation. Guardian Australia has received several reports of under-16s passing the facial age assurance tests, but the government has flagged that it does not expect the ban to be perfect from day one.

All listed platforms apart from X had confirmed by Tuesday they would comply with the ban. The eSafety commissioner, Julie Inman Grant, said eSafety had recently had a conversation with X about how it would comply, but the company had not communicated its policy to users.

Bluesky, an X alternative, announced on Tuesday it would also ban under-16s, despite eSafety assessing the platform as “low risk” due to its small user base of 50,000 in Australia.

Parents of children affected by the ban shared a spectrum of views on the policy. One parent told the Guardian their 15-year-old daughter was “very distressed” because “all her 14 to 15-year-old friends have been age verified as 18 by Snapchat”. Since she had been identified as under 16, they feared “her friends will keep using Snapchat to talk and organise social events and she will be left out”.

Others said the ban “can’t come quickly enough”. One parent said their daughter was “completely addicted” to social media and the ban “provides us with a support framework to keep her off these platforms”.

“The fact that teenagers occasionally find a way to have a drink doesn’t diminish the value of having a clear, national standard.”

Polling has consistently shown that two-thirds of voters support raising the minimum age for social media to 16. The opposition, including its leader, Sussan Ley, has recently voiced alarm about the ban, despite having waved the legislation through parliament and despite the former Liberal leader Peter Dutton championing it.

The ban has garnered worldwide attention, with several nations indicating they will adopt a ban of their own, including Malaysia, Denmark and Norway. The European Union passed a resolution to adopt similar restrictions, while a spokesperson for the British government told Reuters it was “closely monitoring Australia’s approach to age restrictions”.

[–] Arcane2077@sh.itjust.works 136 points 3 days ago* (last edited 3 days ago) (5 children)

Some good silver linings here, but what everyone needs to remember is that nobody would be supporting this at all if Facebook weren’t intentionally predatory and bad for (all) people’s brains.

If regulators in Australia had a spine, they would call for an end to those practices, and now that’s infinitely harder to do

[–] ms_lane@lemmy.world 39 points 3 days ago (3 children)

Some good silver linings here

Where?

The kids will move to less monitored platforms and even on things like YouTube, parental controls are now gone.

You need to have an account for parental controls to be applied to; kids aren't allowed an account, ergo no more parental controls or monitoring for problem content.

[–] wheezy@lemmy.ml 36 points 2 days ago* (last edited 2 days ago) (2 children)

As someone who grew up with an "unmonitored" internet, I can say that it was significantly healthier than the profit-driven "keep watching" algorithm that is all of social media today.

Yeah, I saw "two girls one cup" and "lemon party". But did I slowly have my perspective of reality changed by 30-second videos I swiped through for hours at a time, for days on end?

No, most of my time was spent learning about computers, "stealing" music, and chatting with my real life friends.

I don't think a kid today can experience that internet anymore. It's gone. But acting like "unmonitored" internet access is worse is pearl clutching, and it ignores the fundamental problems the profit-driven internet has created at the expense of society's mental health.

Kids will absolutely find another place to connect online in Australia. But, honestly, I think whatever that is will be healthier than the absolute brain rot that is profit driven social media.

We got to this point because parents think that kids need a monitored internet. Afraid of online predators. So it was passed off to corporations that learned how to systematically institute mental abuse in order to keep their apps open longer.

[–] noobdoomguy8658@feddit.org 8 points 2 days ago

I just wanna say hi, and I remember those days, too.

For a long time, I couldn't understand people saying they hate the Internet or their phone or anything like that, because I had been having a blast for so long and thought it was one of the most vibrant, fun, educational and useful parts of my life, one that taught me a lot.

But at some point I found myself scrolling the same site for hours, trying to tear my eyes off the screen and telling myself that I wasn't enjoying myself and that I should stop, but I just couldn't. That's when I finally understood.

I try to bring intention back to this. I think about what I want to do online before I do it -- what topic to look for when I want to watch a video, what kind of news or discourse I want to read, what's on my mind that I want to share. Talking to my peers, I often feel this kind of approach has long been lost to not thinking for yourself and wanting entertainment to just sort of happen to you, to predict what you want, to guess.

Big money figuring out the Internet has been a very bad thing.

[–] The_Decryptor@aussie.zone 7 points 2 days ago

You need to have an account for parental controls to be applied to; kids aren't allowed an account, ergo no more parental controls or monitoring for problem content.

Except that YT hides pretty much everything interesting behind a login wall these days.

I tried to listen to a Daft Punk song yesterday in a private tab and was blocked.

[–] wheezy@lemmy.ml 9 points 2 days ago

It's a bandaid, and just like previous attempts, all this will do is make Australian kids better at circumventing the censorship or using an alternative website. Which, honestly, is probably a positive in and of itself. I'd much rather my kid be visiting some random forum-type website (like I grew up with) than the absolute brain rot that is social media algorithms.

Seeing "lemon party" posted before the mods removed it definitely fucked me up less than the slop being fed into the brains of teenagers on social media today.

[–] porcoesphino@mander.xyz 16 points 3 days ago* (last edited 3 days ago) (2 children)

I think that's easier said than done. There are a lot of negatives associated with social media and some are easier to put restrictions on (say violent content) but I don't think we really have a good grasp of all the ways use is associated with depression for example. And wouldn't some of this still fall back to age restricted areas, kind of like with movies?

But yeah, it would be nice to see more push back on the tech companies instead of the consumers

[–] The_v@lemmy.world 16 points 3 days ago (2 children)

It's a very simple fix with a few law changes.

  1. The act of promoting or curating user submitted data makes the company strictly liable for any damages done by the content.

  2. The deliberate spreading of harmful false information makes the hosting company liable for damages.

This would bankrupt Facebook, Twitter, etc within 6 months.

[–] Attacker94@lemmy.world 7 points 3 days ago (2 children)

The act of promoting or curating user submitted data makes the company strictly liable for any damages done by the content.

I assume you don't mean simply providing the platform for the content to be hosted; in that case, I agree this would definitely help.

The deliberate spreading of harmful false information makes the hosting company liable for damages.

This one is damn near impossible to enforce for the sole reason of the word "deliberate"; the issue is that I would not support such a law without that part.

[–] T156@lemmy.world 4 points 2 days ago

This one is damn near impossible to enforce for the sole reason of the word "deliberate"; the issue is that I would not support such a law without that part.

It would also be easily abused, especially since someone would have to take a look and check, which would already put a bottleneck in the system, and the social media site would have to take the post down while checking, just in case, which gives people a way to effectively remove posts.

[–] The_v@lemmy.world 3 points 2 days ago

I left out the hosting part for just that reason. The company has to actively do something to gain the liability. Right now the big social media companies are deliberately prioritizing harmful information to maximize engagement and generate money.

As for enforcement, hosters have had to develop protocols for removal of illegal content since the very beginning. It's still out there and can be found, but laws, and mostly due diligence from hosters, make it more difficult to find. It's the reason Lemmy is not full of illegal pics etc. The hosters are actively removing it and banning accounts that publish it.

Those protocols could be modified to include obvious misinformation, bots, etc. Think about the number of studies showing that just a few accounts are the source of the majority of harmful misinformation on social media.

Of course any reporting system needs to be protected from abuse. The DMCA takedown abusers are a great example of why this is needed.

[–] porcoesphino@mander.xyz 1 points 2 days ago

That kind of aligns with some actions I would love to see, but I don't really see how it helps with the example I used to highlight one of the harder things to fix: depression. How does that improve the correlation between social media use and depression in teenagers? I can see it helping in special cases, like removing pro-eating-disorder content, but I'm pretty confident the depression correlation goes well beyond easy-to-moderate content.

Also, if we presume that some amount of horrific violence is okay for adults to choose to see, and a population of people thinks it's reasonable to restrict this content for people below a certain age (or swap violence for sex / nudity), then do we just decide we know better than that population, that freedom is more important, or does it fall back to age restrictions again (but gated on parts of the site)? I'm avoiding saying "government" here and going with "population of people" to try to decouple a little from some of the negatives people associate with government, especially since COVID.

But yeah, holding tech companies accountable like that would be lovely to see. I suspect the cost is so large they couldn't pay, so it would never happen, but I think that's because society has ignored their negative externalities for so long that they're entrenched.

[–] Arcane2077@sh.itjust.works 1 points 3 days ago (1 children)

Oh definitely not easy, my point is that it’s even harder now

[–] porcoesphino@mander.xyz 1 points 3 days ago (1 children)

Why do you say it's harder now?

[–] HK65@sopuli.xyz 7 points 3 days ago (1 children)

You can't use the "think of the children" line anymore

[–] porcoesphino@mander.xyz 1 points 2 days ago* (last edited 2 days ago)

True, but there is momentum. It's empowering other countries, and that could lead to a second pass at legislation in Aus once it's not so outlandish, or it could lead to another country doing something better and then Aus copying it after the costly validation was done by someone else. I think waiting for perfect legislation likely leads to what we've had for a while, and that's even less / very little pushback on tech companies