this post was submitted on 28 Jul 2025
1491 points (99.6% liked)

Collective Shout, a small but vocal lobby group, has long called for a mandatory internet filter that would prevent access to adult content for everyone in Australia. Its director, Melinda Tankard Reist, was recently appointed to the stakeholder advisory board for the government’s age assurance technology trial before the under-16s social media ban comes into effect in Australia in December.

Cethin@lemmy.zip, 1 point, 21 hours ago (last edited 21 hours ago)

They have a risky move, which in 1/10000 cases leads to an illegal game being paid for through their payment platform.

And they have a safe move, where this never happens. Literally.

You're not getting it: the risk is exactly the same. If the game was illegal, it wasn't allowed before either. Someone willing to break the rules doesn't care, and someone already breaking both the law and the rules certainly doesn't care that a new rule also applies to them. This doesn't change the risk at all. It doesn't make the bad outcome any less likely, and it certainly doesn't mean it "literally never happens."

The opposite could be true if something that was only against the rules were later also made illegal: that might dissuade some people who had merely been skirting the rules. But people already breaking the law don't care about breaking an additional rule, because they were breaking the rules anyway. Nothing gets worse for them. If they're discovered, they're removed from the platform, exactly as before.

You must at least be able to follow this simple logic, right? Once you're breaking the rules badly enough to be removed from the platform, why would you care that there are more rules that would also get you removed? You're either stopped or you're not, and the platform either catches the content or it doesn't. The risk to the payment processors is the same either way: either you trust the moderation or you don't. Moderators aren't going to do a better job just because the illegal content is now doubly not allowed. They're either stopping disallowed content or they aren't.