this post was submitted on 10 May 2026
143 points (91.3% liked)

Technology


Both Ubuntu and Fedora have made it official: support is coming soon for running local generative AI instances.

An epic and still-growing thread in the Fedora forums lays out one of the goals for the next version: the Fedora AI Developer Desktop Objective. It has caused some discontent, and at least one Fedora contributor, SUSE’s Fernando Mancera, has resigned over it.

[–] wampus@lemmy.ca 1 points 1 day ago

I just recently had an issue with Access where running a report multiple times would randomly produce either errors or clean reports. The explanation Copilot gave was that Access now has a built-in LLM that is "interpreting" certain queries, which may result in different outputs, even different resulting data types. So doing a join on those outputs for the report would sometimes produce reporting errors, even without changing the underlying logic. AI has sorta taken a calculator that used to reliably return 2 + 2 = 4 and turned it into one that sometimes thinks 2 + 2 = chair.

I don't know how comfy I am seeing AI integrated everywhere at this stage of the tech. It really doesn't feel mature enough for production.
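
For what it's worth, local runners generally let you pin sampling parameters so output is reproducible. A minimal sketch against a local Ollama instance (the model tag and default port are assumptions); two identical requests with temperature 0 and a fixed seed should return the same completion:

    # Greedy sampling with a fixed seed; run it twice and compare
    curl -s http://localhost:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "What is 2 + 2?",
      "stream": false,
      "options": { "temperature": 0, "seed": 42 }
    }' | jq -r .response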

[–] HubertManne@piefed.social 3 points 1 day ago

I'm actually fine with this, but it's got to be GPL all the way, and it should be like runlevels: install console only, install graphical, install LLM supported.
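
For comparison, the runlevel idea maps naturally onto systemd targets. The first two commands below are real; llm.target is purely hypothetical:

    # Boot to a console-only system (no display manager)
    sudo systemctl set-default multi-user.target

    # Boot to the full graphical desktop
    sudo systemctl set-default graphical.target

    # Hypothetical: a target that additionally starts a local LLM runtime
    sudo systemctl set-default llm.target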

[–] just_another_person@lemmy.world 92 points 3 days ago* (last edited 3 days ago) (1 children)

They're discussing a new spin specifically for AI developers, not changing existing distros to include new stuff.

CTFD

[–] Lost_My_Mind@lemmy.world 40 points 3 days ago (5 children)

Oh good. We can all just ignore the AI version then?

[–] FauxLiving@lemmy.world 11 points 3 days ago

It's Linux; you can always ignore any software you don't want.

[–] garbage_world@lemmy.world 12 points 3 days ago (1 children)

No we can't, we must be mad, because "AI"

[–] FauxLiving@lemmy.world 7 points 3 days ago

This guy social medias

[–] Smoogs@lemmy.world 4 points 3 days ago

Right. It is Linux after all. Not microslop

[–] corsicanguppy@lemmy.ca 6 points 3 days ago

Well, we get to slam it and heckle its users, because, well, we need SOME joy in The world...

[–] plz1@lemmy.world 10 points 2 days ago

FWIW, LM Studio makes it incredibly easy to do this. I've been in tech for decades, and there are probably only a couple of suggestions I'd make to the LMS team if they wanted to target a broader, less tech-savvy user base, but I think they already have their target demographic covered. I imagine the Ubuntu and Fedora crowds are already tech savvy, but vendors making it easier to wean people off reliance on tech giants' LLMs isn't a bad thing, if LLMs are here to stay.

Now, the one thing that will turn me off to initiatives like this is if these OS vendors restrict which models can be used, or add friction to not using their "chosen" default. Like Google just did by pushing what I'm assuming was Gemma 4 E2B to Chrome users. I figure Google wants to offload the LLM usage to local compute to take the load off their data centers, and since Chrome is already a data harvesting tool for them, there was no downside to their operations.
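
For the curious, the workflow being praised here is roughly the following; a rough sketch assuming LM Studio's lms command-line tool behaves as its docs describe (the model name is a placeholder):

    # Download a model, load it into memory, and expose a local API server
    lms get qwen2.5-7b-instruct
    lms load qwen2.5-7b-instruct
    lms server start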

[–] FauxLiving@lemmy.world 35 points 3 days ago (2 children)

- The system image will not be pre-configured with applications that inspect or monitor how users interact with the system or otherwise place user privacy at risk.

- Tools and applications included in the AI Desktop will not be pre-configured to connect to remote AI services.

- AI tools will not be added to Fedora’s existing system images, Editions, etc, by the AI Desktop initiative.

  1. It's a new system image, your Fedora and Ubuntu installs will not be modified.
  2. The applications that would have privacy implications will be opt-in by default.
  3. It will not use remote AI unless you configure it to.

I don't see the problem.
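
On point 3: many tools follow the convention of an OpenAI-compatible endpoint, so opting in to a local backend can be as small as an environment override. A sketch, assuming a local Ollama exposing its OpenAI-compatible API (the variable names are the common client convention, not anything Fedora has specified):

    # Point any OpenAI-compatible client at a local model server
    export OPENAI_BASE_URL="http://localhost:11434/v1"
    export OPENAI_API_KEY="unused-but-often-required"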

[–] potustheplant@feddit.nl 10 points 3 days ago (1 children)

Of using resources to implement a "feature" that users have already expressed they don't want? Weird, I do.

[–] FauxLiving@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

The users would be the people who choose to install the new system image, which will be created by a group of people who volunteer to create it.

If you're not using the new system image then this doesn't affect you at all.

The only opinions that matter here are the ones of the people who are choosing to donate their time to the project. If you think that more development time should be allocated to one project over another then you are free to volunteer your time in order to make that happen.

You are not free to volunteer the time of other people, however.

[–] PotatoPie@lemmy.zip 4 points 2 days ago (1 children)

Add opt-in commercials as well that pay the users in AI tokens, and an opt-in system for selling the data on your OS to Palantir that pays users in stock of Israeli arms dealers. Give the users freedom of choice; no problem, still /s

[–] FauxLiving@lemmy.world 7 points 2 days ago

/s

Oh, I like this game.

You're right: if you take something actually happening in reality and append a scary fabricated scenario, then it sounds scary.

What an interesting new discovery; I hope nobody tries to apply this in reality. Completely unrelated, have you considered a career in politics?

I apologize to the community for lacking the paranoid delusions required to see how a project staffed by volunteers who are creating a system image which users have to choose to install was actually part of a plot by the Illuminati to force people to sell their soul to the Antichrist.

As an unwitting tool of The Man, I throw myself at the mercy of the community's judgement. Please scourge me most thoroughly so that I may repent from my evil ways of observing reality as it appears.

[–] mrmaplebar@fedia.io 30 points 3 days ago (1 children)

Always pretty funny to see the FOSS community willingly follow the lead of corporate investment trends, but for fractions of pennies on the dollar compared to big tech.

If you're gonna act like Microsoft you might as well do it for the money.

[–] W3dd1e@lemmy.zip 3 points 2 days ago

To be fair, those two distros are owned by big corporations so it’s not unexpected.

[–] JoeKrogan@lemmy.world 39 points 3 days ago (3 children)

I think it is good to have optional support for local models that lets people use them in an offline, private, and easy way. There are a lot of non-technical folks using Linux nowadays, and many chose it for privacy and greater control over their data.

Depending on the implementation, it could hook into certain OS contexts and events to actually be helpful.

Either way, I don't see the cat going back in the bag with regards to LLMs. That being said, I run Debian everywhere except my work machine, which is Ubuntu.

Preemptive compliance to the tech fascism of the US?

What a ridiculous and spineless way to live.

[–] badgermurphy@lemmy.world 18 points 3 days ago* (last edited 3 days ago) (1 children)

In this case, the "bag" is a sucking black hole, and venture capitalists are throwing physics-defying amounts of money into it to drag the LLMs out. As soon as they stop that, the "cat" goes back in the "bag".

Local LLM models are an exception, but they are also atrocious by comparison. Most users would get some limited utility from an LLM if they had one, maybe, but it is being accommodated and foisted everywhere like it's the invention of the mouse. It is nowhere near as paradigm-shifting, but it is being hyped, advertised, and marketed more aggressively than any product in history. The roaring hype makes everyone think that if they don't get on board too, they'll be left in the dust, so now well-meaning projects are getting bloated up for it too.

Many of us just want this technology to get the fuck away from us until it is worth using or dies already. Is that so very much to ask?

[–] CheerfulPassionFruit@lemmy.world 7 points 3 days ago (1 children)

Using an LLM model that isn't super advanced is actually quite freeing, in my opinion. The generated output is always mediocre at best, but it's usually good enough for boilerplate and can be decent if you need to get yourself unstuck. It also isn't good enough to lull you into just letting the LLM do all the work for you, since it makes obvious mistakes.

[–] badgermurphy@lemmy.world 11 points 3 days ago (1 children)

"It's just good enough for some things once in a while, but is too bad to rely on in any serious way," doesn't sound like a great use of my electricity, but I guess I've wasted electricity on less. Still, doing it on purpose seems worse.

[–] luciferofastora@feddit.org 2 points 2 days ago (5 children)

I mean, it sounds like a tool they occasionally find useful and don't use otherwise. I'm not sure how "occasionally use a tool good enough for my purposes" is a waste. Whether it's the most efficient application of that electricity is a different question, but without knowing their particular scenarios I can't really compare whether other tools use less electricity for the same purpose.

(Yes, of course, "just do it all in your brain" is even more efficient, but if that's an argument against utilities, you probably shouldn't waste electricity on Lemmy either)

[–] anon_8675309@lemmy.world 27 points 3 days ago

What’s wrong with just installing ollama?
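
For reference, the ollama route today is about two commands; a minimal sketch (the model tag is an assumption, pick whatever fits your hardware):

    # Official install script for Linux
    curl -fsSL https://ollama.com/install.sh | sh

    # Pull and chat with a small local model
    ollama run llama3.2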

[–] ejs@piefed.social 28 points 3 days ago (2 children)

lol they already support running local models. wtf is the distro gonna do…? pre-install llama.cpp? this is so silly to me that people are resigning over this, too.
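
For scale, "pre-install llama.cpp" would save users roughly these steps; a sketch assuming a GGUF model is already downloaded (the model path is a placeholder):

    # Build llama.cpp from source
    git clone https://github.com/ggerganov/llama.cpp
    cd llama.cpp
    cmake -B build && cmake --build build --config Release

    # Run a one-off prompt against a local GGUF model
    ./build/bin/llama-cli -m ~/models/some-model.gguf -p "Hello" -n 64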

[–] Gork@sopuli.xyz 17 points 3 days ago (1 children)

Is that the one that really whips the llama.cpp's ass?

[–] luthis@lemmy.nz 10 points 3 days ago (3 children)

Now I have to install winamp and get a bunch of sick skins for it.


[–] its_kim_love@lemmy.blahaj.zone 16 points 3 days ago (1 children)

I mean if it's completely optional and opt-in, sure, go for it. Knock yourself out. I won't be using it (my computer isn't powerful enough to run local LLMs).
[–] mokey@therock.fraggle-rock.org 6 points 3 days ago (6 children)

FOMO is a thing. Who's gonna want to run the old version packaged in the distro anyway? Those things go stale pretty quickly, especially at the rate we're seeing updates in local inference.

[–] woelkchen@lemmy.world 7 points 3 days ago (2 children)

SUSE contributes to Fedora? WTF?

[–] Goodlucksil@lemmy.dbzer0.com 5 points 3 days ago

Someone from SUSE used to contribute to Fedora

[–] RiverRabbits@lemmy.blahaj.zone 2 points 2 days ago (1 children)

This isn't a "ghost in the machine". This is "introducing rot directly into the core of the machine".

Time to boycott all ubuntu and fedora derivatives and distributions. Fuck AI!

[–] unphazed@lemmy.world 1 points 2 days ago

I just got comfortable with Bazzite, too. 'Cept the weird OOM that popped up with Firefox, still attempting to fix that one...

[–] magnue@lemmy.world 6 points 3 days ago (3 children)

I don't really understand the point? You can already do a lot on Linux using AI via CLI with bash.
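
For example, a small shell helper against a local model server already covers a lot; a sketch assuming Ollama on its default port and jq installed (the helper name and model tag are made up):

    # Send a question to a local model and print the reply
    ask() {
      curl -s http://localhost:11434/api/generate \
        -d "$(jq -n --arg p "$*" '{model: "llama3.2", prompt: $p, stream: false}')" \
        | jq -r .response
    }

    ask "write a find command that deletes *.tmp files older than 7 days"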
