this post was submitted on 13 Aug 2025
439 points (95.3% liked)

Technology

74055 readers
3387 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
(page 2) 50 comments
[–] TAG@lemmy.world 7 points 1 day ago (4 children)

The article argues that extremist views and echo chambers are inherent in public social networks where everyone is trying to talk to everyone else. That includes Fediverse networks like Lemmy and Mastodon.

They argue for smaller, more intimate networks like group chats among friends. I agree with the notion, but I am not sure how someone can build these sorts of environments without just inviting a group of friends and making an echo chamber.

There's actually some interesting research behind this: Dunbar's number suggests humans can only maintain about 150 meaningful relationships, which is why those smaller networks tend to work better psychologically than the massive free-for-alls we've built.

[–] 9point6@lemmy.world 62 points 2 days ago (4 children)

If Meta and Twitter ceased to exist tomorrow, 99% of the issues would be solved, IMO.

The fediverse is social media and it doesn't have anything close to the same kinds of harmful patterns

Lemmy does have problems, though: lots of emotional, judgemental, and brigading content. But it's less here than elsewhere, probably.

[–] Nougat@fedia.io 59 points 2 days ago (4 children)

It's almost like the problem isn't social media, but the algorithms that put content in front of your eyeballs to keep your engagement in order to monetize you. Like a casino.
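To make the "algorithm vs. chronology" distinction concrete, here's a minimal sketch of what an engagement-optimized feed ranker might look like next to a plain chronological one. The weights and time decay are illustrative assumptions for the sake of the example, not any real platform's values:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    likes: int
    comments: int
    shares: int

def engagement_score(p: Post) -> float:
    # Reactions that keep users on the platform (comments, shares) are
    # weighted most heavily; recency is rewarded via a decay on age.
    # These numbers are made up purely to illustrate the shape of the idea.
    return (1.0 * p.likes + 4.0 * p.comments + 8.0 * p.shares) / (p.age_hours + 2.0) ** 1.5

def engagement_feed(posts: list[Post]) -> list[Post]:
    # What the commenter is objecting to: rank by predicted engagement.
    return sorted(posts, key=engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # The old-forum alternative: newest first, no scoring at all.
    return sorted(posts, key=lambda p: p.age_hours)
```

The point of the sketch is that the two feeds can order the same posts very differently: an older post that provokes lots of comments and shares outranks a newer, quieter one under the engagement scorer, which is exactly the dynamic being compared to a casino.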

[–] Jason2357@lemmy.ca 8 points 2 days ago

Facebook was pretty boring before they tried to make money. Still ick, but mostly just people posting pictures of activities with family or friends.

[–] chaosCruiser@futurology.today 11 points 2 days ago* (last edited 2 days ago) (4 children)

Amazon, Google and Microsoft would still be there, so the Internet seems to be suffering from a metastatic cancer at this point. Cutting off two revolting lumps helps, but the prognosis doesn’t look that great.

[–] kalkulat@lemmy.world 28 points 2 days ago

Of course -corporate- social media can't be fixed ... it already works exactly the way they want it to...

[–] Xanthrax@lemmy.world 29 points 2 days ago

We're on the solution right now, lmao

[–] mctoasterson@reddthat.com 20 points 2 days ago (3 children)

I think just going back to internet forums circa early 2000s is probably a better way to engage honestly. They're still around, just not as "smartphone friendly" and doomscroll-enabled, due to the format.

I'm talking stuff like SomethingAwful, GaiaOnline, Fark, Newgrounds forum, GlockTalk, Slashdot, vBulletin etc.

These types of forums allowed you to discuss timely issues and news if you wanted. You could go a thousand miles deep on some bizarre subculture or stick to general discussion. They also had proto-meme culture before memes were a thing, a.k.a. "embedded image macros".

[–] Jason2357@lemmy.ca 8 points 2 days ago (1 children)

Anything that is topic-focused rather than built around following individuals makes a big difference; take away the engagement algorithm on top of that and it's much better.

This is a good point. It's like asking the question: "What is more important in politics? People, or ideas?"

People respond very differently to that. To some it's people, and to some it's ideas. That is why you have Xitter-like microblogging which is focused around people, and reddit-like communities which are focused around topics/ideas.

[–] Perspectivist@feddit.uk 10 points 2 days ago (3 children)

Of course not. The issue with social media is the people. Algorithms just bring out the worst in us, but they didn't make us like this; we already were.

[–] grrgyle@slrpnk.net 5 points 1 day ago (1 children)

From my point of view something that brings out the worst in us sounds like a really big part of the issue.

We've always been modified by our situations, so why not create better situations rather than lamenting that we don't have the grit to break through whatever toxic society we find ourselves grafted onto?

Sorry I know I'm putting a lot on your comment that I know you didn't mean, but I see this kind of unintentional crypto doomerism a lot. I think it holds people to an unhealthy standard.

[–] gandalf_der_12te@discuss.tchncs.de 1 points 1 day ago* (last edited 1 day ago)

The reason why it brings out the worst in people is because it has open borders. You can shit into the network and move on. If you were forced to stay and live with your shit, you'd shit less into the public domain. That means small networks, harder to move to other/new networks, ...

[–] avidamoeba@lemmy.ca 14 points 2 days ago* (last edited 2 days ago)

Uhm, I seem to recall that social media was actually pretty good in the late 2000s and early 2010s. The authors used AI models as the users. Could it be that their models have internalized the effects of the algorithms that fundamentally changed social media over the past decade, and are now reproducing those effects in the experiments? It sounds like they're treating models as if they're humans, and they are not, especially when it comes to changing behaviour based on changes in the environment, which is exactly what they were testing by trying different algorithms and mitigation strategies.

[–] tacosanonymous@mander.xyz 9 points 2 days ago

Neat.

Release the Epstein files, then burn it all down.

[–] Zak@lemmy.world 12 points 2 days ago

The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren't people, and the authors have not convinced me that they will behave like people in this context.

The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There's no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.

I mostly use social media to share pictures of birds. This contributes to some of the problems the source article discusses. It causes fragmentation; people who don't like bird photos won't follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict.

Social media was a mistake, tbh

[–] General_Effort@lemmy.world 9 points 2 days ago (1 children)

The original source is here:

https://arxiv.org/abs/2508.03385

Social media platforms have been widely linked to societal harms, including rising polarization and the erosion of constructive debate. Can these problems be mitigated through prosocial interventions? We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms. We create a minimal platform where agents can post, repost, and follow others. We find that the resulting following-networks reproduce three well-documented dysfunctions: (1) partisan echo chambers; (2) concentrated influence among a small elite; and (3) the amplification of polarized voices – creating a “social media prism” that distorts political discourse. We test six proposed interventions, from chronological feeds to bridging algorithms, finding only modest improvements – and in some cases, worsened outcomes. These results suggest that core dysfunctions may be rooted in the feedback between reactive engagement and network growth, raising the possibility that meaningful reform will require rethinking the foundational dynamics of platform architecture.
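The abstract's "generative social simulation" (LLM agents on a minimal post/repost/follow platform) can be sketched as a toy agent-based loop. Everything here is illustrative: a crude homophily rule stands in for the LLM policy, and the parameters (top-ten visibility, a 0.3 leaning threshold, a 0.5 follow probability) are assumptions for the sake of the sketch, not values from the paper:

```python
import random

class Agent:
    def __init__(self, aid: int, leaning: float):
        self.id = aid
        self.leaning = leaning      # political position in [-1, 1]
        self.following = set()      # ids of accounts this agent follows

    def act(self, feed: list) -> None:
        # The paper has an LLM decide whether to post, repost, or follow.
        # This stub replaces the LLM with a simple homophily rule:
        # follow authors whose posts sit close to your own leaning.
        for post in feed[:10]:      # agents only see their top-ten posts
            close = abs(post["leaning"] - self.leaning) < 0.3
            if close and post["author"] != self.id and random.random() < 0.5:
                self.following.add(post["author"])
        # Then publish a post reflecting the agent's own leaning.
        feed.insert(0, {"author": self.id, "leaning": self.leaning})

def simulate(n_agents: int = 50, steps: int = 20, seed: int = 0) -> list:
    random.seed(seed)
    agents = [Agent(i, random.uniform(-1, 1)) for i in range(n_agents)]
    feed = []                       # shared timeline, newest first
    for _ in range(steps):
        for agent in agents:
            agent.act(feed)
    return agents
```

By construction, every follow link in this toy connects agents within 0.3 of each other's leaning, so the following-network segregates into like-minded clusters. The paper's claim is stronger: with real LLM agents and no hand-coded homophily rule, the same echo-chamber structure still emerges from reactive engagement and network growth.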

[–] AceFuzzLord@lemmy.zip 3 points 1 day ago

I mean, I feel like just shutting it down would solve at least some problems. Shuttering it all, video sharing platforms included.

Not something most people would agree with, but it's an idea.
