this post was submitted on 21 Jan 2026
656 points (98.5% liked)

Technology


Workers should learn AI skills and companies should use it because it's a "cognitive amplifier," claims Satya Nadella.

In other words: please help us, use our AI.

top 50 comments
[–] DrCake@lemmy.world 7 points 32 minutes ago (2 children)

AI industry needs to encourage job seekers to pick up AI skills, in the same way people master Excel to make themselves more employable.

Has anyone in the last 15 years willingly learned Excel? It seems like one of those things you have to learn on the job because your boomer managers insist on using it.

[–] Buddahriffic@lemmy.world 2 points 8 minutes ago

Funny thing about "AI skills" that I've noticed so far is that they are actually just skills in the thing you're trying to get AI to help with. If you're good at that, you can often (though not always) get an effective result. Mostly because you can talk about it at a deeper level and catch mistakes the AI makes.

If you have no idea about the thing, it might look competent to you, but you just won't be catching the mistakes.

In that context, I would call them thought amplifiers, and pretty effective at the whole "talking through a problem can help debug it, even if the other person contributes nothing of value, because explaining it forces you to look at the problem differently and that different perspective might make the solution visible" thing, while also being able to contribute some valuable pieces.

[–] jj4211@lemmy.world 1 points 11 minutes ago

Yeah, very good analogy actually...

I remember back in the day people putting stuff like 'Microsoft Word' under 'skills'. Instead of thinking 'oh good, they will be able to use Word competently', the impression was 'my god, they think Word is a skill worth bragging about, I'm inclined to believe they have no useful skills'.

'Excel skills' on a resume is just so vague: some people put it down when they've just figured out they can click and put things into a table, while others can quickly roll out some complicated formula, which is at least more of a skill (I'd rather program the normal way than try to wrangle some of the abominations I've seen in Excel sheets).

Using an LLM is not a skill with a significant acquisition cost. To the extent that it does or does not work, it doesn't really need learning. If anything people who overthink the 'skill' of writing a prompt just end up with stupid superstitions that don't work, and when they first find out that it doesn't work, they just grow new prompt superstitions to add to it to 'fix' the problem.
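The "abominations" point is easy to illustrate. A hypothetical sketch (the tier names and thresholds here are made up, not from any real sheet): a lookup that grows into a nested-IF monster in a spreadsheet is a few readable lines in an ordinary language.

```python
# The same tier lookup, two ways.
# Excel-style nested IFs, as commonly seen in the wild:
#   =IF(A1>=90,"gold",IF(A1>=75,"silver",IF(A1>=50,"bronze","none")))

def tier(score: int) -> str:
    """Map a score to a tier with plain, readable branching."""
    if score >= 90:
        return "gold"
    if score >= 75:
        return "silver"
    if score >= 50:
        return "bronze"
    return "none"

print(tier(82))  # silver
```

In a real sheet the nesting rarely stops at three levels, which is where the wrangling starts.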

[–] OshagHennessey@lemmy.world 29 points 1 hour ago (1 children)

"Microsoft thinks it has social permission to burn the planet for profit" is all I'm hearing.

[–] P1k1e@lemmy.world 3 points 24 minutes ago

Well, they at least have investor permission... and investors are the only people they care about anyway

[–] matlag@sh.itjust.works 5 points 44 minutes ago* (last edited 43 minutes ago)

Takeaways:

  1. MS is well aware AI is useless.
  2. Nadella admits they invested billions in something without having the slightest clue what its use-case would be ("something something rEpLaCe HuMaNs")
  3. Nadella is blissfully unaware of the "social" image MS already has in the eyes of the public. You don't have our social permission to still exist as a company!
[–] HertzDentalBar@lemmy.blahaj.zone 1 points 4 minutes ago

Well, you already lost that, or rather never actually had it. You all pushed a broken and incomplete product; you need to find a use for it, not us...

[–] SparroHawc@lemmy.zip 2 points 29 minutes ago

"We have to find a compelling use case so we can keep tragedying the commons!"

[–] LifeLikeLady@lemmy.world 1 points 18 minutes ago

Dear CEOs. I have revoked my permission. In fact it was never given.

[–] FreddiesLantern@leminal.space 64 points 3 hours ago (3 children)

How can you lose social permission that you never had in the first place?

[–] JoeBigelow@lemmy.ca 22 points 2 hours ago (3 children)

The peasants might light their torches

[–] Xylian@lemmy.world 1 points 5 minutes ago

"Torching" the gas turbines at AI companies' datacenters would be highly effective, especially since they are outside and only a fence protects them.

It is so dumb that they gas our environment for "AI". It was evil doing it in WW1 and WW2 and it is still evil today.

It is insane.

[–] DarkFuture@lemmy.world 3 points 1 hour ago

This guy knows how to translate billionaire dipshit speak.

[–] tempest@lemmy.ca 8 points 2 hours ago (1 children)

Datacenters are expensive and soft targets.

[–] Knock_Knock_Lemmy_In@lemmy.world 2 points 1 hour ago (2 children)

Dude, buildings are pretty hard.

[–] tempest@lemmy.ca 1 points 6 minutes ago

Yeah but it's really easy to hurt their feelings so be mindful

[–] demonsword@lemmy.world 1 points 59 minutes ago (1 children)

not OP but I believe they're "soft" in the sense that they don't have moats/high electric fences/battalions of armed guards around 24/7

[–] JoeBigelow@lemmy.ca 2 points 43 minutes ago (1 children)

With a clipboard you could probably just walk in and start unplugging things

[–] tempest@lemmy.ca 1 points 2 minutes ago

That's... not quite true. Access is usually taken quite seriously. In a multi-tenant space, every tenant is separated, and the physical cages around the machines are locked and monitored.

All the same they are designed to keep small numbers of mostly law abiding people out, not an angry mob with torches.

[–] rumba@lemmy.zip 9 points 2 hours ago

There's a latency between asking for forgiveness and being demanded to stop.

[–] lando55@lemmy.zip 4 points 2 hours ago

It's easier to beg for social forgiveness than it is to ask for social permission

[–] Photonic@lemmy.world 2 points 1 hour ago (1 children)

Do something useful

What do you mean, using ChatGPT for a recipe for eggs, sunny side up without any seasoning or toppings, and burning up a moderate household's week of electricity with my query, isn't useful?

[–] SparroHawc@lemmy.zip 2 points 24 minutes ago

It's not the query that burns through electricity like crazy, it's training the models.

You can run a query yourself at home with a desktop computer, as long as it has enough RAM and compute to support the model you're using (think a few high-end GPUs).

Training a model requires a huge pile of computing power, though, and the AI companies are constantly scraping the internet to ~~steal~~ find more training material.
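A rough rule of thumb for the "enough RAM" part: weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime buffers. The sketch below is back-of-envelope only; the overhead factor is an assumption, not a benchmark.

```python
def approx_model_ram_gb(params_billion: float, bits_per_weight: int,
                        overhead: float = 1.2) -> float:
    """Back-of-envelope RAM estimate for running an LLM locally.

    params_billion:  model size in billions of parameters
    bits_per_weight: 16 for fp16, 8 or 4 for common quantizations
    overhead:        assumed fudge factor for KV cache / runtime buffers
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 70B model in fp16 needs on the order of 170 GB -- multiple GPUs.
# The same model quantized to 4 bits fits in roughly 42 GB.
print(round(approx_model_ram_gb(70, 16)))  # 168
print(round(approx_model_ram_gb(70, 4)))   # 42
```

This is why quantization is what makes "run it at home" plausible at all: it cuts the memory footprint roughly fourfold versus fp16.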

[–] kameecoding@lemmy.world 25 points 3 hours ago (4 children)

I will try to have a balanced take here:

The positives:

  • there are some uses for this "AI"
  • like an IDE, it can help speed up development, especially for menial but important tasks such as unit test coverage.
  • it can be useful for rewording things into the corpo slang that will make you puke, if you have to use it.
  • it is useful as a sort of better Google: for things that are documented, but where reading the documentation makes your head hurt, you can ask it to dumb it down, get the core concept, and go from there

The negatives

  • the positives don't justify the environmental externalities of all these AI companies
  • the positives don't justify the PC hardware/silicon price hikes
  • shoehorning this into everything is capital-R retarded.
  • AI is a fucking bubble keeping the US economy inflated instead of letting it crash like it should have a while ago
  • other than a paid product like Copilot, there is simply very little commercially viable use-case for all this public cloud infrastructure, other than targeting you with more ads, which you can't block because they're in the text output.

Overall, I wish the AI bubble would burst already.

[–] ViatorOmnium@piefed.social 24 points 3 hours ago (2 children)

menial tasks that are important such as unit test coverage

This is one of the cases where AI is worse. LLMs will generate the tests based on how the code works, not how it is supposed to work. Granted, lots of mediocre engineers also use the "freeze the results" method for meaningless test coverage, but at least human beings have the ability to reflect on what the hell they are doing at some point.

[–] Buddahriffic@lemmy.world 1 points 3 minutes ago

You could have it write unit tests as black box tests, where you only give it access to the function signature. Though even then, it still needs to understand what the test results should be, which will vary from case to case.
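A hedged sketch of what "black box from the signature" can look like: given only a hypothetical `def slugify(title: str) -> str`, the tests pin down properties of the output rather than mirroring the implementation (the implementation below is just a stand-in so the tests have something to run against).

```python
import re

def slugify(title: str) -> str:
    """Stand-in implementation under test (any implementation would do)."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Black-box checks: they state what the result must look like,
# not how the function computes it.
def test_slugify_black_box():
    s = slugify("Hello, World!")
    assert s == s.lower()                  # output is lowercase
    assert " " not in s                    # no whitespace survives
    assert re.fullmatch(r"[a-z0-9-]*", s)  # restricted alphabet
    assert slugify("abc") == "abc"         # already-clean input is stable

test_slugify_black_box()
```

The catch the comment raises still applies: someone (human or model) has to know what the right properties are for each case.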

[–] arendjr@programming.dev 4 points 2 hours ago

So I’m the literal author of the Philosophy of Balance, and I don’t see any reason why LLMs are deserving of a balanced take.

This is how the Philosophy of Balance works: We should strive…

  • for balance within ourselves
  • for balance with those around us
  • and ultimately, for balance with Life and the Universe at large

But here’s the thing: LLMs and the technocratic elite funding them are a net negative to humanity and the world at large. Therefore, to strive for a balanced approach towards AI puts you on the wrong side of the battle for humanity, and therefore human history.

Pick a side.

[–] rumba@lemmy.zip 5 points 2 hours ago

They f'd up with electricity rates and hardware price hikes. Until then, they had been getting away with it by not inconveniencing enough laymen.

[–] morto@piefed.social 60 points 4 hours ago (4 children)
  • Denial
  • Anger
  • Bargaining <- They're here
  • Depression
  • Acceptance
[–] matlag@sh.itjust.works 2 points 52 minutes ago

Correct, but it needs clarification:
Depression refers to the whole economy as the bubble bursts.
Acceptance is when the government agrees to bail them out, because they're too big and the gov is too dependent on them to let them die.

[–] rumba@lemmy.zip 18 points 2 hours ago (3 children)

The five stages of corporate grief:

  • lies
  • venture capital
  • marketing
  • circular monetization
  • private equity sale
[–] FilthyShrooms@lemmy.world 9 points 2 hours ago

Denial: "AI will be huge and change everything!"

Anger: "noooo, stop calling it slop, it's gonna be great!"

Bargaining: "please use AI, we spent so much money on it!"

Depression: companies losing money and dying (hopefully)

Acceptance: everyone gives up on it (hopefully)

[–] Chais@sh.itjust.works 7 points 3 hours ago* (last edited 2 hours ago) (1 children)

Which seems like good progress. I feel like they were in denial not three weeks ago.

[–] DaddleDew@lemmy.world 44 points 4 hours ago* (last edited 4 hours ago)

Translation: Microslop's executives are finally starting to realize that they fucked up.

[–] vane@lemmy.world 1 points 1 hour ago

I'll buy it for 1 dollar.

[–] circuitfarmer@lemmy.sdf.org 22 points 3 hours ago

Textbook definition of a solution searching for a problem.

[–] fartographer@lemmy.world 4 points 2 hours ago

Must do something useful? You're the one selling the damn thing. You can't build a Pinto and then tell people "we have to stop burning to death or we'll lose permission to keep producing faulty cars."

There is something inherently wrong with your product, and you can't even fix it because you're too busy shoving it down everyone's throats.

It's like you're trying to bake cookies using pieces of every plagiarized baking recipe, whether or not they're related. Then, before you've actually tasted the cookies, you're telling everyone to reach into the oven and try using this "basic" cookie to modify and make their own cookies.

Except the cookies haven't even baked yet. And before you've ever tasted a single fully baked cookie, you're announcing modifications to your cookie dough recipe based on feedback from your previously undercooked, improperly made cookies.

Go back to small scale. Let people bake their own cookies at home, and report what they've discovered. Try upscaling those recipes, and see if you can make any parts more efficient.

And quit telling people to eat your tainted cookies that are poisoning everyone, and then telling them that if they don't start enjoying your cookies soon, then you're gonna have to shut down your factory.

Your cookie/Pinto/AI venture deserves to be shut down. Take the L, learn from it, and try again after you figure out how to get it right. Bake a better cookie instead of trying to make better consumers.

[–] 1984@lemmy.today 3 points 2 hours ago

They have nothing consumers want.

[–] ViatorOmnium@piefed.social 5 points 3 hours ago

What does Capitalism™ say about "innovations" that can't deliver results? Filtering out crap that only works on some bullshit paper is the one thing capitalism is supposed to be good at.

[–] Sam_Bass@lemmy.world 24 points 4 hours ago (4 children)

Best use for AI is CEO replacement

[–] Jankatarch@lemmy.world 2 points 2 hours ago* (last edited 2 hours ago)

One small step toward the actual solution: "have all the employees elect a CEO every 4 years."

[–] NutWrench@lemmy.world 73 points 6 hours ago* (last edited 6 hours ago) (8 children)

The whole point of "AI" is to take humans OUT of the equation, so the rich don't have to employ us and pay us. Why would we want to be a part of THAT?

AI data centers are also sucking up all the high-end memory on the market, making everything that relies on that RAM ridiculously expensive. I can't wait for this fad to be over.

[–] UnspecificGravity@piefed.social 4 points 3 hours ago

Yeah, 'cause it's totally not end-stage capitalism to invest a trillion dollars into something and THEN figure out what it's for.

[–] SeeMarkFly@lemmy.ml 153 points 7 hours ago (6 children)

So...he has something USELESS and he wants everybody to FIND a use for it before HE goes broke?

I'll get right on it.

[–] rizzothesmall@sh.itjust.works 1 points 2 hours ago

I really respected Nadella for pushing developer experience and championing opening up the source of .NET. I was like, "There's a guy who knows his business and his audience."

This, tho... Come on dude.

[–] Fizz@lemmy.nz 1 points 2 hours ago

It's funny, because this is an admission that he's not actually done anything interesting, which is a complete pivot from the last two years of him screaming about how good AI is.

[–] kescusay@lemmy.world 293 points 8 hours ago (41 children)

"Cognitive amplifier?" Bullshit. It demonstrably makes people who use it stupider and more prone to believing falsehoods.

I'm watching people in my industry (software development) who've bought into this crap forget how to code in real-time while they're producing the shittiest garbage I've laid eyes on as a developer. And students who are using it in school aren't learning, because ChatGPT is doing all their work - badly - for them. The smart ones are avoiding it like the blight on humanity that it is.
