this post was submitted on 25 Apr 2026
44 points (84.4% liked)

Selfhosted


I'm looking to build a low-end Ollama LLM server to improve Home Assistant voice control, Immich image recognition, and a few other services. With the current cost of hardware components like memory, I'm looking to build something small but somewhat expandable.
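For context on what these services actually need from the box: Home Assistant, Immich, and similar tools can all talk to a single Ollama instance over its HTTP API (port 11434 by default). A minimal sketch of querying such a server, assuming a placeholder hostname and model name:

```python
import json
import urllib.request

# Placeholder hostname for the planned server; substitute your own.
OLLAMA_URL = "http://ollama.local:11434"


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a one-shot prompt to the Ollama server and return its text response."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The point for hardware planning is that everything funnels through one endpoint, so the server only needs enough VRAM/RAM for the largest model you actually run (small quantized models in the 3B range are a common choice for voice-command workloads).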

I have an old micro-ATX form factor computer that I'm thinking will be a good candidate to upgrade. I'd love recommendations on motherboard, processor, and video card combos that are likely to be compatible and sufficient to run a decent server while keeping costs down: basically, the best bang for the buck. I have a couple of M.2 SSDs I can repurpose. I'd prefer a motherboard with 2.5Gbit Ethernet, but otherwise I'm open.

I'd also appreciate recommendations on sites that sell good-quality memory at reasonable prices and ship to the US. I'd be willing to look at lightly used components, too.

Any advice on any of these topics would be greatly appreciated. The advice I've found is all out of date: with crypto mining fading, video cards aren't as expensive as they were, but LLM data centers are now buying up and reserving memory before it's even manufactured.

clifmo@programming.dev | 2 points | 2 days ago

Until you tell us what your budget is, I'm not sure there's much to discuss. You're talking about motherboards, so I guess your choice right now comes down to Strix Halo or not?

irotsoma@piefed.blahaj.zone | 1 point | 2 days ago

Definitely not needing something that high-end. It's just me and maybe one other person using it periodically for voice commands, which need to be real-time. The rest is background processing like Immich image recognition and Jellyfin audio/video work, so nothing fancy is needed. I mention motherboards because the system I'm thinking of upgrading currently runs Plex, which I'm in the process of replacing with Jellyfin on my Kubernetes cluster of mini PCs and Raspberry Pis. That cluster runs most of my stuff pretty well but could benefit from a dedicated LLM/ML box. So the machine will be freed up, but it's nearly a decade old and not up to the task as it is.
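For what it's worth, offloading Immich's image recognition to a dedicated box like this is mostly a configuration change: Immich's machine-learning component ships as its own container, and the main server is pointed at it over the network. A sketch, assuming Docker Compose and a placeholder hostname `ml-box.local`:

```yaml
# On the dedicated LLM/ML machine: run only Immich's ML container.
services:
  immich-machine-learning:
    image: ghcr.io/immich-app/immich-machine-learning:release
    ports:
      - "3003:3003"

# On the existing Immich server, set the environment variable:
#   IMMICH_MACHINE_LEARNING_URL=http://ml-box.local:3003
```

This keeps the Kubernetes cluster doing what it does well while the heavier inference runs on the freed-up machine.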

As for a specific budget, I don't have a number in mind. My Kubernetes cluster is very energy efficient since it's all small systems that only spin up when needed, so I'm thinking about total cost of ownership versus benefit. Something too high-end would waste energy as well as the initial investment.

radieschen@slrpnk.net | 1 point | 1 day ago

You could consider something from the Radxa Rock 5 series.