this post was submitted on 28 Mar 2026
190 points (98.5% liked)

Technology

[–] supersquirrel@sopuli.xyz 1 points 1 month ago* (last edited 1 month ago) (19 children)

The less you ask for, the more it can do.

Which is the entire ideology of the cult around AI.

Y'all want a world that gives you nothing you ask for while we are powerless to do anything about it.

You are wrong, or at least you are wrong to identify your beliefs as belonging to the realm of rationality or science. What you are espousing is a set of religious beliefs in the power of AI that there is zero evidence AI will ever fulfill, and damn if it isn't a lame and depressingly cynical religion.

It is clear when dealing with folks from your cult that proving AI is shit, and that it has fundamental problems and limitations, is irrelevant to the cult's worldview. It just proves to whatever particular cultist you are talking to that their particular subbranch of the cult is the True Sect, unlike the other subbranches stuck on the old teachings that aren't really magical... and that the REAL AGI is just around the corner and this is all a distraction.

It is the same nonsense problem you get when you start proclaiming dates for the end of the world in your religion and they keep inconveniently passing by without the world ending. To maintain the delusion you must divide up the religion and say "oh, it was that sect over there that was wrong; we have the true knowledge!" Rinse, repeat.

I am fine with you having different spiritual beliefs than me, just don't waste everyone's time by trying to force people into thinking your religion is reality.

No one cares who isn't already part of your cult.

Sora shut down because this tech has moved on to local models running near real-time.

No, Sora shut down because AI is a bullshit business model that doesn't consistently produce anything of useful value other than the obfuscation of theft or responsibility, while it consumes a vast amount of resources to accomplish what a modest number of humans with food, water, shelter and love could make far more efficiently and with far more soul.

I regularly hear normal people describe shitty, fake things that come off hollow as "like AI". You can see what a false future AI is in how little people like what it makes, almost as a rule. The reason normal people hate AI is that it is so suffocatingly often a boring black box that spits out sloppy, unoriginal crap chopped up from stolen human labor, something most normal people are used to identifying in the hollow structures of society around them.

[–] FreedomAdvocate -1 points 1 month ago (9 children)

The only “cult” around AI is the anti-AI cult. The rest of us just acknowledge the reality that AI is now an everyday tool to use, and it’s revolutionising the world at a break-neck pace. It’s turning entire industries on their heads, doing in minutes for a handful of dollars what a year ago took 300 people millions of dollars to do.

The genie is out of the bottle, and it’s not going back in ever again. The genie is only going to get more powerful by leaps and bounds. The anti-AI crowd are going to be left behind, unemployed, and even worse for them - unemployable.

[–] supersquirrel@sopuli.xyz 1 points 1 month ago* (last edited 1 month ago) (8 children)

The rest of us just acknowledge the reality that AI is now an everyday tool to use, and it’s revolutionising the world at a break-neck pace.

Except it isn't? Most of us who don't worship techbros like you don't think highly of the quality of AI's output; it has become common parlance for people to describe fake- and hollow-feeling things as "like AI", and I agree with the aesthetic label. Y'all are just too blind to see it while you try to force it down our throats. We are amid a massive economic bubble with AI that is about to burst, given that almost no AI companies are profitable and they consume an incredible amount of energy.

You are fantasizing about a religion, great, you can believe in whatever you want but stop making a clown out of yourself by pretending what you are espousing isn't a set of religious beliefs with no hard evidence to support the magical thinking they demand.

[–] FreedomAdvocate -2 points 1 month ago (1 children)

No one is “worshipping tech bros” 🤣. Most people like what AI is bringing to the world. It makes lots of jobs infinitely easier. It opens new doors for people, doors that previously were locked shut with a million padlocks and booby traps.

Like I said, the only cult-like behaviour is from people like you.

[–] supersquirrel@sopuli.xyz 1 points 1 month ago (1 children)

Most people like what AI is bringing to the world.

Where is your proof of this?

[–] FreedomAdvocate -1 points 1 month ago* (last edited 1 month ago) (1 children)

Where’s your proof?

My evidence would be the absolutely massive and widespread adoption of all things AI. Yours would be…?

[–] supersquirrel@sopuli.xyz 1 points 1 month ago* (last edited 1 month ago) (1 children)

These things aren’t getting built, or if they’re getting built, it’s taking way, way longer than expected, which means that interest on that debt is piling up. The longer it takes, the less rational it becomes to buy further NVIDIA GPUs — after all, if data centers are taking anywhere from 18 months to three years to build, why would you be buying more of them? Where are you going to put them, Jensen?

This also seriously brings into question the appetite that private credit and other financiers have for funding these projects, because much of the economic potential comes from the idea that these projects get built and have stable tenants. Furthermore, if the supply of AI compute is a bottleneck, this suggests that when (or if) that bottleneck is ever cleared, there will suddenly be a massive supply glut, lowering the overall value of the data centers in progress…which are, by the way, all filled with Blackwell GPUs, which will be two or three-years-old by the time the data centers are finally turned on.

I also wonder whether the demand actually exists to make any of this worthwhile, or what people are actually paying for this compute.

If we assume 3GW of IT load capacity was brought online in America, that should (theoretically) mean tens of billions of dollars of revenue thanks to the “insatiable demand for AI” — except nobody appears to be showing massive amounts of revenue from these data centers.

https://www.wheresyoured.at/the-ai-industry-is-lying-to-you/
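The quoted piece asserts that 3GW of IT load should translate into tens of billions of dollars of annual revenue. A back-of-envelope sketch of that arithmetic, using purely illustrative inputs (the roughly $2 per kW per hour rental rate and full utilization are my assumptions for the sake of the math, not figures from the article):

```python
# Back-of-envelope check of the "3GW should mean tens of billions in
# revenue" claim. Every input except the 3GW figure is an assumption.
it_load_gw = 3.0           # IT load brought online (from the quoted article)
price_per_kw_hour = 2.0    # assumed $ billed per kW of compute per hour
utilization = 1.0          # assume fully rented, best case

kw = it_load_gw * 1_000_000
annual_revenue = kw * 8760 * price_per_kw_hour * utilization
print(f"Implied annual revenue: ${annual_revenue / 1e9:.1f}B")
# → Implied annual revenue: $52.6B
```

Even if you halve the price and utilization, the implied revenue stays in the tens of billions, which is the scale the article says nobody is actually reporting.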

Although there has been between $30 and $40 billion in enterprise investment into generative AI, a recent MIT report shows that 95 percent of organizations are seeing zero return.

Just 5 percent of integrated artificial intelligence pilots “are extracting millions in value,” while the majority contribute no measurable impact to profits, the report found.

https://thehill.com/policy/technology/5460663-generative-ai-zero-returns-businesses-mit-report/

In this study, we show that, despite the ubiquity of AI-generated content, it does not perform well in search and answer engines:

86% of articles ranking in Google Search are written by humans, and only 14% are generated using AI.

82% of articles cited by ChatGPT & Perplexity are written by humans, and only 18% are generated using AI.

When AI-generated articles do appear in Google Search, they tend to rank lower than human-written articles.

https://graphite.io/five-percent/ai-content-in-search-and-llms

They found that people perceived AI scientists more negatively than climate scientists or scientists in general, and that this negativity is driven by concern about AI scientists’ prudence – specifically, the perception that AI science is causing unintended consequences. The researchers also examined whether these negative perceptions might be a result of AI being so new and unknown, but found that public perceptions of AI science and scientists did not significantly improve from 2024 to 2025, even as AI became a more common presence in everyday life.

https://www.asc.upenn.edu/news-events/news/ai-perceived-more-negatively-climate-science-or-science-general

AI is also a threat in how it lures people into psychosis, because it pathologically confirms every impulse you have, so trying to argue that everyone loves AI is going to backfire on you. Everyone loved cigarettes too when they were a new thing. People still love cigarettes; that only proves they are addictive.

We find that sycophancy is both prevalent and harmful. Across 11 AI models, AI affirmed users’ actions 49% more often than humans on average, including in cases involving deception, illegality, or other harms. On posts from r/AmITheAsshole, AI systems affirm users in 51% of cases where human consensus does not (0%). In our human experiments, even a single interaction with sycophantic AI reduced participants’ willingness to take responsibility and repair interpersonal conflicts, while increasing their own conviction that they were right. Yet despite distorting judgment, sycophantic models were trusted and preferred. All of these effects persisted when controlling for individual traits such as demographics and prior familiarity with AI; perceived response source; and response style. This creates perverse incentives for sycophancy to persist: The very feature that causes harm also drives engagement.

https://www.science.org/doi/10.1126/science.aec8352

The study, published Thursday in the journal Science, tested 11 leading AI systems and found they all showed varying degrees of sycophancy — behavior that was overly agreeable and affirming. The problem is not just that they dispense inappropriate advice but that people trust and prefer AI more when the chatbots are justifying their convictions.

https://www.rochesterfirst.com/science/ap-ai-is-giving-bad-advice-to-flatter-its-users-says-new-study-on-dangers-of-overly-agreeable-chatbots/

The productivity gains aren't there for AI, the business use cases aren't actually there for AI, and people increasingly associate AI with "slop" as they realize how boring and poor-quality the content AI makes is... and even in Google's own search rankings, AI-written content barely makes it anywhere near the top because it scores so low on relevance and engagement with people.

Oh yeah, and again: AI sends people into psychosis by putting them into echo chambers, so defending AI as likable isn't a rational defense of it, in the same way that arguing a Venus flytrap tastes good to a fly is a poor argument for the fly to step in.

[–] FreedomAdvocate -1 points 3 weeks ago (1 children)

🤣 you really used articles using search engine rankings to try and discredit AI 🤦

Nothing you posted there is proof that people don’t want or use AI. AI is in use almost everywhere now.

[–] supersquirrel@sopuli.xyz -1 points 3 weeks ago (1 children)

you really used articles using search engine rankings to try and discredit AI

If I am speaking of relevance to a query, yes I am. Search engine rankings are the most widely distributed measure we have of content's practical relevance to people.

Actually, everything I have provided helps build the case that people don't want AI, and that most people don't use AI in a way that brings revolutionary value.

[–] FreedomAdvocate 0 points 3 weeks ago (1 children)

AI is replacing search engines. AI doesn’t show up in search engines lol.

[–] supersquirrel@sopuli.xyz 0 points 3 weeks ago

I know that is what tech companies think will happen, but it massively misunderstands the role of a search engine as an input into any broad LLM-type tool.

Search engines are going nowhere; or rather, if they die it will not be because they were made irrelevant, it will be because of overconsolidation in the market (Google).
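That point can be made concrete with a sketch of the retrieval-augmented pattern that "AI search" products broadly follow. The `web_search` and `llm_complete` functions here are hypothetical placeholders standing in for whatever backends a real product uses, not any actual API:

```python
# Sketch: in "AI search", a search engine is still the input that does
# the relevance ranking; the model only summarizes what retrieval found.
# web_search and llm_complete are hypothetical stand-in callables.

def answer(question, web_search, llm_complete, k=5):
    results = web_search(question)[:k]   # the search index still picks the sources
    context = "\n".join(r["snippet"] for r in results)
    prompt = (
        f"Using only these sources:\n{context}\n\n"
        f"Answer the question: {question}"
    )
    return llm_complete(prompt)          # the LLM rephrases; it does not rank
```

If the retrieval step degrades, the answer degrades with it, which is why the model sits downstream of the index rather than replacing it.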
