this post was submitted on 08 Jan 2026
427 points (99.3% liked)

Technology

top 19 comments
[–] TheFeatureCreature@lemmy.ca 135 points 3 days ago (1 children)

Goddamn Dell of all companies is the only one that seems to get it. I'm shocked.

Welcome to the world outside of your own ass, Dell. The air is fresher out here.

[–] shifty@leminal.space 38 points 3 days ago

I want a shiny new ThinkPad, but I don't want the stupid Copilot button that I'll need to reprogram when I install Linux.

[–] wuffah@lemmy.world 60 points 3 days ago* (last edited 3 days ago) (2 children)

The TPU is not for you; it’s for tech companies to surveil, train on, and exfiltrate your data at every level of the stack, from hardware to application, in a “secure” fashion. The “features” they prescribe feel forced because they are: they exist to make it seem like you’re getting the next new thing, when the features you actually want (like an ad-free experience and a local Windows account) have been enshittified to the point that they functionally no longer exist.

  • The hardware manufacturer gets to sell you extra hardware you don’t want.

  • The OS vendor gets to run classification on all of your pictures, documents, and software.

  • The browser vendor gets to see every site you visit and when, every product you buy, and every porn site you watch.

  • Then, it all gets trained into a personal model that is cryptographically tied to your hardware and identity that can be queried at any time by tech companies, data brokers, advertisers, and law enforcement.

What YOU get is to ask a finders-fee-biased chatbot which product to buy, maybe some shitty mental health care, and a sycophantic buddy to patronize your ego. AI is the ultimate surveillance and advertising device.

[–] MagicShel@lemmy.zip 16 points 3 days ago

Yeah, any AI with that much visibility into my life needs to be a locally run and personally controlled AI.

But frankly, as much as I might like that for myself, I don't want it, because then it'll be baked into work computers under the same circumstances, except now you have to placate an AI for career advancement.

On the other hand, I just had an amazing idea for an AI-powered USB device which emulates a keyboard but just does random SRS BIZNESS tasks like 16 hours a day. It'll find articles on the internet and graph all the numbers (even page numbers) in a spreadsheet. It'll create PowerPoints out of YouTube videos. It'll draft marketing materials and email them to random internet addresses. You'll be president of the company by the end of the month if AI has anything to say about it!

[–] CeeBee_Eh@lemmy.world 3 points 2 days ago

The OS vendor gets to run classification on all of your pictures, documents, and software.

Not on Linux

[–] melfie@lemy.lol 10 points 2 days ago* (last edited 2 days ago) (1 children)

On a related note, TIL AMD has an evolving competitor to Apple Silicon, except they named it “Ryzen AI” so everyone could ignore it. They apparently announced a couple of new chips this week, and if it weren’t for a cgchannel article mentioning it in connection with Blender, video editing, etc., it never would’ve registered in my brain that the SoCs have GPU cores as well and might be useful for something. Still RDNA 3.x, so meh, but I’m looking forward to a future SoC with RDNA 5, which supposedly has drastically improved RT cores on par with Nvidia's, to use in a render farm at some point. They’re definitely not going to beat Nvidia if they keep using stupid names for their chips, though. NPU cores are useful for some things, but an SoC with the full gamut is the way to go.

Those APUs are top tier in the mini PC world, but they cost a pretty penny for sure.

[–] xxce2AAb@feddit.dk 30 points 3 days ago

Thank fucking God.

[–] qyron@sopuli.xyz 20 points 3 days ago

And so it begins?

[–] umbrella@lemmy.ml 6 points 2 days ago

Just in time for RAM prices to explode.

[–] Xerxos@lemmy.ml 5 points 2 days ago (1 children)

These AI PCs/notebooks aren't even able to run the really big models. For those you'd need a $5,000 card. And for the smaller models, a good graphics card is often enough.

For a real AI PC we would need new technology, or graphics cards with more RAM (the latter would surprise me at current RAM prices).
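To put rough numbers on that claim: the memory needed just to hold a model's weights scales as parameter count × bytes per parameter, before you even count the KV cache and activations. A back-of-the-envelope sketch (the model sizes and precisions are illustrative, not tied to any specific product):

```python
def weight_gib(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory (GiB) needed just to hold model weights."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# 70B parameters at fp16 (2 bytes each): ~130 GiB -- beyond any single consumer card.
print(round(weight_gib(70, 2)))      # 130
# The same model quantized to 4 bits: ~33 GiB, still multi-GPU territory.
print(round(weight_gib(70, 0.5)))    # 33
# A 7B model at 4 bits: ~3.3 GiB, comfortable on a decent graphics card.
print(round(weight_gib(7, 0.5), 1))  # 3.3
```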

[–] SuspciousCarrot78@lemmy.world 2 points 2 days ago* (last edited 2 days ago)

I'm still sanguine that 1.58-bit BitNet models will take off. Those could plausibly run at a good clip on existing CPUs, no GPU needed.

A super basic Medium article for those not in the know:

https://medium.com/@kondwani0099/reimagining-ai-efficiency-a-practical-guide-to-using-bitnets-1-bit-llm-on-cpus-without-ef804d3fb875

~~Necessity~~ Spite is usually a good driver... though given BitNet is Microsoft IP... ehh... I won't hold my breath for too long. Still waiting for their 70B model to drop... maybe this year....
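For context on why 1.58-bit models are CPU-friendly: each weight is restricted to {-1, 0, +1} (hence log2(3) ≈ 1.58 bits), so matrix multiplies reduce to additions and subtractions. A toy sketch of the absmean quantization idea, as described for BitNet b1.58 — this is an assumption-laden illustration, not Microsoft's implementation:

```python
import numpy as np

def absmean_ternary(w: np.ndarray):
    """Quantize a weight tensor to {-1, 0, +1} with an absmean scale,
    in the style described for BitNet b1.58 (toy sketch only)."""
    scale = np.abs(w).mean() + 1e-8          # per-tensor absmean scale
    q = np.clip(np.round(w / scale), -1, 1)  # ternary weights
    return q, scale

# Tiny demo: every quantized weight is -1, 0, or +1,
# and q * scale is a coarse approximation of w.
w = np.array([[0.5, -1.2, 0.05, 2.0]], dtype=np.float32)
q, s = absmean_ternary(w)
print(q)  # [[ 1. -1.  0.  1.]]
```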

[–] sirico@feddit.uk 11 points 3 days ago

Enterprise has spoken

[–] Cromer4ever@lemmy.world 3 points 2 days ago

Almost nobody takes advantage of all the AI features they offer; Dell realized this.

[–] RobotToaster@mander.xyz 1 points 3 days ago (3 children)

Do NPUs/TPUs even work with ComfyUI? That's the only "AI PC" I'm interested in.

[–] L_Acacia@lemmy.ml 2 points 2 days ago

Support for custom nodes is poor, and NPUs are fairly slow compared to GPUs (expect 5x to 10x longer generation times than 30xx+ GPUs in best-case scenarios). NPUs are good at running small models efficiently, not large LLM / image models.

[–] SuspciousCarrot78@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

NPUs yes, TPUs no (or not yet). Rumour has it that Hailo is meant to be releasing a plug-in NPU "soon" that accelerates LLMs.