this post was submitted on 28 Jul 2025
1474 points (99.6% liked)

Technology


Collective Shout, a small but vocal lobby group, has long called for a mandatory internet filter that would prevent access to adult content for everyone in Australia. Its director, Melinda Tankard Reist, was recently appointed to the stakeholder advisory board for the government’s age assurance technology trial before the under-16s social media ban comes into effect in Australia in December.

you are viewing a single comment's thread
[–] altima_neo@lemmy.zip 110 points 1 day ago (2 children)

And we're the ones spending the money

[–] echodot@feddit.uk 23 points 1 day ago

That's really what I don't get. Why make it impossible for people to give you money? That doesn't seem to be the way capitalism is supposed to operate: if something is popular, you should allow it.

[–] reactionality@lemmy.sdf.org 3 points 1 day ago* (last edited 1 day ago) (1 children)

They're the ones at risk of losing money if they get sued over reintroducing said content. You're not going to stop using the payment processors, because there's literally no other option. This is performative.

[–] Cethin@lemmy.zip 6 points 1 day ago (1 children)

Sued for what? They aren't stopping illegal content from being sold; that, as the word "illegal" implies, was already not allowed on these stores. They're stopping legal but potentially objectionable (not my opinion) content from being sold. There's no legal risk in allowing it.

[–] reactionality@lemmy.sdf.org -1 points 7 hours ago (1 children)

I'm not saying there is illegal content. Read my comment.

I'm saying the possibility of there being illegal content only exists if they allow the reintroduction of those titles. They'd need trust in the store moderation, in the lack of bad faith actors, in a lot of things.

And it would be an absolutely stupid business decision for them.

I am NOT condoning what they did, nor what they are doing. I am explaining, from their business perspective, why allowing potentially illegal content back onto the platform is a non-starter and why you cannot convince them otherwise.

[–] Cethin@lemmy.zip 3 points 7 hours ago (1 children)

> I'm saying the possibility of there being illegal content only exists if they allow the reintroduction of those titles.

Again, no. If there were illegal content before, it was already breaking the rules. If you're breaking the rules once, why would adding more rules change anything?

> They'd need trust in the store moderation, in the lack of bad faith actors, in a lot of things.

What? Yeah, the store moderators have to enforce the rules. I don't know what that has to do with anything. Illegal or just banned, the content has to be removed by the moderators either way, so what difference does it make? Adding more rules doesn't magically remove the content; moderators still have to do it. If they weren't doing it for illegal content, why would they do it for content that's merely banned but legal?

They did it because they were pressured by a weird group that has a lot of influence. It wasn't because they were worried about illegal content, which is obvious because that's not the rule they applied. If the rule were "you're not allowed to sell illegal content" (which is obviously always true), it'd be fine. Instead they made a rule disallowing specific types of legal content.

[–] reactionality@lemmy.sdf.org -2 points 5 hours ago (1 children)

You're not great at risk assessment, are you?

They have a risky move, which in 1/10000 cases leads to an illegal game being paid for through their payment platform.

And they have a safe move, where this never happens. Literally.

If the expected loss in case 1 is positive, they will opt for case 2.
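
For concreteness, the expected-value comparison being argued here can be written out explicitly. A minimal Python sketch, where the 1/10000 probability is the figure from the comment above and every other number is a made-up placeholder, not real data:

```python
# Back-of-the-envelope version of the "risky move vs. safe move" comparison.
# All figures below are placeholder assumptions, not real data.

p_illegal_sale = 1 / 10_000       # chance an allowed title leads to an illegal sale (figure quoted above)
cost_per_incident = 10_000_000.0  # assumed legal / fine / reputational cost if that happens
forgone_revenue = 100_000.0       # assumed processing-fee revenue given up by blocking the titles

expected_loss_if_allowed = p_illegal_sale * cost_per_incident  # the "risky move"
expected_loss_if_blocked = forgone_revenue                     # the "safe move" only costs the lost fees

print(f"allow: expected loss = {expected_loss_if_allowed:,.0f}")
print(f"block: expected loss = {expected_loss_if_blocked:,.0f}")
# Which move wins depends entirely on the assumed magnitudes, which is
# exactly what the two commenters are disagreeing about.
```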

You must at least be able to understand this simple logic, right? If not, then I'm afraid this conversation is over because you're not even remotely trying to understand their logic, and you're just looking for a reason to be mad. Your irrationality makes me nauseous.

[–] Cethin@lemmy.zip 1 points 2 hours ago* (last edited 2 hours ago)

> They have a risky move, which in 1/10000 cases leads to an illegal game being paid for through their payment platform.
>
> And they have a safe move, where this never happens. Literally.

You're not getting it. Both moves carry the exact same risk. If the content was illegal, it wasn't allowed before either. If you're breaking the rules, you don't care; especially if you were already breaking both the law and the rules, you don't care that there's a new rule that also applies. This doesn't change the risk at all. It doesn't make illegal content any less likely, and certainly not something that "literally never happens."

The opposite could be true: if something were only against the rules and then also became illegal, that might push some people who were skirting the rules to reconsider. But if they were already breaking the law, they don't care that they're now also breaking a new rule; they were breaking the rules anyway. It doesn't make anything worse for them. If they're discovered, they're removed from the platform, exactly the same as before.

You must at least be able to understand this simple logic, right? Once you're breaking the rules badly enough to be removed from the platform, why would you care that there are now more rules that would also get you removed? You're either stopped or you're not, and the platform either stops that content or it doesn't, so the risk to the payment processors is the same. Either you trust the moderation or you don't. Moderators aren't going to do a better job just because the illegal content is now doubly not allowed; they're either stopping content that isn't allowed or they aren't.