this post was submitted on 21 Jan 2026
584 points (98.3% liked)

Technology


Workers should learn AI skills and companies should use it because it's a "cognitive amplifier," claims Satya Nadella.

In other words: please help us, use our AI.

[–] FreddiesLantern@leminal.space 49 points 2 hours ago (3 children)

How can you lose social permission that you never had in the first place?

[–] JoeBigelow@lemmy.ca 15 points 1 hour ago (1 children)

The peasants might light their torches

[–] tempest@lemmy.ca 5 points 1 hour ago (1 children)

Datacenters are expensive and soft targets.

[–] Knock_Knock_Lemmy_In@lemmy.world 1 points 9 minutes ago

Dude, buildings are pretty hard.

[–] rumba@lemmy.zip 7 points 1 hour ago

There's a latency between asking for forgiveness and being demanded to stop.

[–] lando55@lemmy.zip 3 points 1 hour ago

It's easier to beg for social forgiveness than it is to ask for social permission

[–] kameecoding@lemmy.world 20 points 1 hour ago (4 children)

I will try to have a balanced take here:

The positives:

  • there are some uses for this "AI"
  • like an IDE, it can help speed up development, especially for menial but important tasks such as unit test coverage.
  • it can be useful to reword things to match the corpo slang that will make you puke if you need to use it.
  • it is useful as a sort of better google, like for things that are documented but reading the documentation makes your head hurt so you can ask it to dumb it down to get the core concept and go from there

The negatives

  • the positives don't justify the environmental externalities of all these AI companies
  • the positives don't justify the PC hardware/silicon price hikes
  • shoehorning this into everything is capital R retarded.
  • AI is a fucking bubble keeping the US economy inflated instead of letting it crash like it should have a while ago
  • other than a paid product like Copilot, there is simply very little commercially viable use-case for all this public cloud infrastructure other than targeting you with more ads, which you can't block because they're embedded in the text output.

Overall, I wish the AI bubble would burst already.

[–] ViatorOmnium@piefed.social 19 points 1 hour ago (1 children)

menial tasks that are important such as unit test coverage

This is one of the cases where AI is worse. LLMs will generate tests based on how the code works, not how it is supposed to work. Granted, lots of mediocre engineers also use the "freeze the results" method for meaningless test coverage, but at least human beings have the ability to reflect on what the hell they are doing at some point.

[–] JoeBigelow@lemmy.ca 1 points 1 hour ago

I think machine learning has vast potential in this area, specifically things like running iterative tests in a laboratory or parsing very large data sets. But a fuckin LLM is not the solution. It makes a nice translation layer, so I don't need to speak and understand bleep bloop and can tell it what I want in plain language. But beyond that, an LLM seems useless to me outside of fancy search uses. It should be the initial processing layer that figures out what type of actual AI (ML) to utilize to accomplish the task. I just want an automator that I can direct in plain language; why is that not what's happening? I know that I don't know enough to have an opinion, but I do anyway!
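The "translation layer in front of specialized tools" idea can be sketched as a plain dispatcher. Everything here is made up for illustration (the handler names, the routing table), and simple keyword matching stands in for the LLM so the example stays self-contained:

```python
from typing import Callable

# Stand-in "tools" the router can dispatch to; in the real design these
# would be specialized ML pipelines, not stubs.
def summarize_dataset(request: str) -> str:
    return "dispatched to: dataset summarizer"

def run_regression(request: str) -> str:
    return "dispatched to: regression model"

# Keyword -> handler table; an LLM's job would be to do this mapping
# robustly from free-form language instead of exact keywords.
ROUTES: dict[str, Callable[[str], str]] = {
    "summarize": summarize_dataset,
    "regression": run_regression,
}

def route(request: str) -> str:
    """Pick the first handler whose keyword appears in the request."""
    text = request.lower()
    for keyword, handler in ROUTES.items():
        if keyword in text:
            return handler(request)
    return "no matching tool"
```

In that architecture the language model only translates intent; the actual work is done by whatever purpose-built system sits behind the router.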

[–] rumba@lemmy.zip 5 points 1 hour ago

They f'd up with electricity rates and hardware price hikes. Until then, they were getting away with it by not inconveniencing enough laymen.

[–] arendjr@programming.dev 2 points 1 hour ago

So I’m the literal author of the Philosophy of Balance, and I don’t see any reason why LLMs are deserving of a balanced take.

This is how the Philosophy of Balance works: We should strive…

  • for balance within ourselves
  • for balance with those around us
  • and ultimately, for balance with Life and the Universe at large

But here’s the thing: LLMs and the technocratic elite funding them are a net negative to humanity and the world at large. Therefore, to strive for a balanced approach towards AI puts you on the wrong side of the battle for humanity, and therefore human history.

Pick a side.

[–] Schal330@lemmy.world 1 points 56 minutes ago

it is useful as a sort of better google, like for things that are documented but reading the documentation makes your head hurt so you can ask it to dumb it down to get the core concept and go from there

I agree with this point so much. I'm probably a real thicko, and being able to use it to explain concepts in a different way or provide analogies has been so helpful for my learning.

I hate the impact from use of AI, and I hope that we will see greater efficiencies in the near future so there is less resource consumption.

[–] morto@piefed.social 54 points 2 hours ago (3 children)
  • Denial
  • Anger
  • Bargaining <- They're here
  • Depression
  • Acceptance
[–] rumba@lemmy.zip 16 points 1 hour ago (1 children)

The five stages of corporate grief:

  • lies
  • venture capital
  • marketing
  • circular monetization
  • private equity sale
[–] lando55@lemmy.zip 1 points 1 hour ago (2 children)

Where do the three envelopes fit in

[–] rumba@lemmy.zip 2 points 1 hour ago

Roll 2d6 on private equity sale. 7 or higher and you get to ride again with an IPO at position 6. 1-6 and you get to fill out the envelopes.

[–] elvith@feddit.org 1 points 1 hour ago

In my pocket

[–] FilthyShrooms@lemmy.world 8 points 1 hour ago

Denial: "AI will be huge and change everything!"

Anger: "noooo stop calling it slop, it's gonna be great!"

Bargaining: "please use AI, we spent so much money on it!"

Depression: companies losing money and dying (hopefully)

Acceptance: everyone gives up on it (hopefully)

[–] Chais@sh.itjust.works 7 points 1 hour ago* (last edited 1 hour ago) (1 children)

Which seems like good progress. I feel like they were in denial not three weeks ago.

[–] arendjr@programming.dev 1 points 1 hour ago

May the depression be long lasting and heartfelt in the United States of AI.

[–] fartographer@lemmy.world 3 points 1 hour ago

Must do something useful? You're the one selling the damn thing. You can't build a Pinto and then tell people "we have to stop burning to death or we'll lose permission to keep producing faulty cars."

There is something inherently wrong with your product, and you can't even fix it because you're too busy shoving it down everyone's throats.

It's like you're trying to bake cookies using pieces of every plagiarized baking recipe, whether or not they're related. Then, before you've actually tasted the cookies, you're telling everyone to reach into the oven and try using this "basic" cookie to modify and make their own cookies.

Except the cookies haven't even baked yet. And before you've ever tasted a single fully baked cookie, you're announcing modifications to your cookie dough recipe based on feedback from your previously undercooked, improperly made cookies.

Go back to small scale. Let people bake their own cookies at home, and report what they've discovered. Try upscaling those recipes, and see if you can make any parts more efficient.

And quit telling people to eat your tainted cookies that are poisoning everyone, and then telling them that if they don't start enjoying your cookies soon, then you're gonna have to shut down your factory.

Your cookie/Pinto/AI venture deserves to be shut down. Take the L, learn from it, and try again after you figure out how to get it right. Bake a better cookie instead of trying to make better consumers.

[–] DaddleDew@lemmy.world 41 points 2 hours ago* (last edited 2 hours ago)

Translation: Microslop's executives are finally starting to realize that they fucked up.

[–] circuitfarmer@lemmy.sdf.org 20 points 2 hours ago

Textbook definition of a solution searching for a problem.

[–] 1984@lemmy.today 2 points 1 hour ago

They have nothing consumers want.

[–] rizzothesmall@sh.itjust.works 1 points 42 minutes ago

I really respected Nadella for pushing developer experience and championing opening up the source of .Net. I was like "There's a guy who knows his business and his audience"

This, tho... Come on dude.

[–] Fizz@lemmy.nz 1 points 42 minutes ago

It's funny, because this is an admission that he hasn't actually done anything interesting, which is a complete pivot from the last two years of him screaming about how good AI is.

[–] ViatorOmnium@piefed.social 4 points 1 hour ago

What does Capitalism™ say about "innovations" that can't deliver results? Filtering out crap that only works on some bullshit paper is the one thing capitalism is supposed to be good at.

Yeah, 'cause it's totally not end-stage capitalism to invest a trillion dollars into something and THEN figure out what it's for.

[–] Sam_Bass@lemmy.world 24 points 3 hours ago (2 children)

Best use for AI is CEO replacement

[–] Jankatarch@lemmy.world 2 points 1 hour ago* (last edited 1 hour ago)

One small step toward the actual solution: "have all the employees vote for a CEO every 4 years."

[–] Newsteinleo@infosec.pub 4 points 2 hours ago (2 children)

The problem with this is that the savings will go to shareholders, not workers.

[–] frog_brawler@lemmy.world 2 points 1 hour ago

Next step is to make sure only the workers are shareholders.

[–] HalfSalesman@lemmy.world 3 points 2 hours ago

The savings will also cause the corporation's profits to decline in the mid term, and then the savings will actually go to private equity and hedge funds, not the shareholders.

[–] Someone8765210932@lemmy.world 1 points 1 hour ago

You can tell how useful AI is by how much billionaires are investing in text, audio, and video slop generation.

Isn't there now some social media app that consists only of AI-generated slop videos?

You'd think their focus should be a little different...

[–] NutWrench@lemmy.world 73 points 4 hours ago* (last edited 4 hours ago) (4 children)

The whole point of "AI" is to take humans OUT of the equation, so the rich don't have to employ us and pay us. Why would we want to be a part of THAT?

AI data centers are also sucking up all the high-quality RAM on the market, making everything that relies on that RAM ridiculously expensive. I can't wait for this fad to be over.

[–] Wildmimic@anarchist.nexus 5 points 2 hours ago

I have to agree, even if I have no issue with GenAI itself. No one needs as many datacenters as they are planning. Adoption will crash as soon as they try monetizing it for real. Even if they try using cloud gaming as a load in those centers: not one person I know would trade their local PC for something that's dependent on a fast internet connection without data caps and introduces a permanent 100ms+ delay on all games.

I swapped my 3070 Ti 8GB for a 5070 16GB; if I sell off the 3070 Ti, the upgrade cost me 300€ (but I tend to keep it as a backup), and I can run my local GenAI and LLM workloads without issues now. I don't need datacenters, I need CDNs so I can get the content I run locally. And TBH, if they really try to kill local compute in gaming: I have enough games here to last me a decade or more without getting bored, and I can play all of that while sitting in a mountain cabin.
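Whether a given model fits on a 16GB card is easy to sanity-check with back-of-the-envelope arithmetic; this is a rough sketch, and the ~20% overhead factor for KV cache and activations is an assumption, not a measured figure:

```python
def fits_in_vram(params_billion: float, bits_per_weight: int,
                 vram_gb: float, overhead: float = 1.2) -> bool:
    """Rough VRAM check: weight bytes = params * bits / 8, plus ~20%
    headroom for KV cache and activations (the factor is an assumption)."""
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params at 1 byte each ~ 1 GB
    return weights_gb * overhead <= vram_gb

# e.g. a 13B model at 4-bit quantization: ~6.5 GB of weights, fits in 16 GB;
# the same model at 16-bit: ~26 GB of weights, does not fit.
```

This is why quantized mid-size models are comfortable on a 16GB consumer card while full-precision weights are not.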

[–] danielton1@lemmy.world 29 points 4 hours ago (1 children)

Not to mention the water depletion and electricity costs that the people who live near AI data centers have to deal with, because tech companies can't be expected to be responsible for their own usage.

[–] Poem_for_your_sprog@lemmy.world 16 points 4 hours ago (3 children)

I'd love to take humans out of the equation of all work possible. The problem is how the fascist rulers will treat the now unemployed population.

[–] h3ron@lemmy.zip 6 points 2 hours ago

When did they ever ask permission?

[–] SeeMarkFly@lemmy.ml 145 points 5 hours ago (6 children)

So...he has something USELESS and he wants everybody to FIND a use for it before HE goes broke?

I'll get right on it.

[–] mariegriffiths@lemmy.cafe 1 points 1 hour ago

I am for AI, but decentralized and owned by the people, not monopolized and owned by the 1%.

[–] FireWire400@lemmy.world 16 points 3 hours ago* (last edited 3 hours ago)

Dude, you never had "social permission" to do this in the first place, none of us asked for this shit. You're literally destroying the planet, the economy and our future for your personal gain.

You useless waste of space.
