I've been thinking about adding this to my "Fuck it, I'll do it myself" / SHTF pile. I have a spare 10–15 GB for a good selection of basic articles (across the sciences, history, pop culture trivia, etc.).

https://get.kiwix.org/en/solutions/hotspots/content-bundles/

https://get.kiwix.org/en/solutions/hotspots/imager-service/

There's something inherently cool about having Wikipedia in a box (yes, you'd likely need to refresh it once a year), but I've never heard of anyone actually self-hosting a Kiwix instance.

[–] domi@lemmy.secnd.me 3 points 9 hours ago (2 children)

Do you actually train the LLM or use RAG? I have been looking for a local LLM + Wikipedia RAG solution for a while now.

For now I just have kiwix-serve + SearXNG doing a simple search, but the Kiwix search is... questionable.
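
For anyone curious, the ZIM's own full-text index can also be queried straight from Python, without kiwix-serve or SearXNG in the middle. It's the same Xapian index under the hood, so the ranking won't magically improve, but it does get the article text somewhere it can be re-ranked or embedded. A rough sketch, assuming python-libzim and beautifulsoup4 are installed; the ZIM filename is a placeholder:

```python
# Rough sketch: pull candidate articles out of a Wikipedia ZIM with
# python-libzim's full-text search, then flatten them to plain text so
# they can be embedded or stuffed into an LLM prompt for RAG.
# Assumptions: `pip install libzim beautifulsoup4`; the ZIM filename
# below is a placeholder for whatever dump you actually have.
from libzim.reader import Archive
from libzim.search import Query, Searcher
from bs4 import BeautifulSoup

ZIM_PATH = "wikipedia_en_all_nopic.zim"  # placeholder filename


def retrieve(question: str, k: int = 3) -> list[tuple[str, str]]:
    """Return (title, plain_text) for the top-k full-text matches."""
    zim = Archive(ZIM_PATH)
    search = Searcher(zim).search(Query().set_query(question))
    paths = list(search.getResults(0, k))  # result paths inside the ZIM

    docs = []
    for path in paths:
        entry = zim.get_entry_by_path(path)
        html = bytes(entry.get_item().content).decode("utf-8", errors="replace")
        text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
        docs.append((entry.title, text[:4000]))  # crude context-length cap
    return docs


if __name__ == "__main__":
    for title, text in retrieve("photosynthesis"):
        print(title, "->", text[:120])
```

Whatever that returns can then be embedded or pasted straight into the prompt of a local model (Ollama, llama.cpp server, and the like).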

[–] surfrock66@lemmy.world 1 points 4 hours ago

So this is actively in progress, and right now I'm having trouble getting my Tesla P4s working in my Proxmox environment. The P4 is allegedly supported for vGPU out of the box, but the installer I used forces a kernel version pin, which isn't making me happy:

https://github.com/anomixer/proxmox-vgpu-installer/issues/16

So at this time, I'm just connecting APIs.

[–] SuspciousCarrot78@lemmy.world 2 points 7 hours ago* (last edited 6 hours ago)

Somewhere in my documents, I have a scoped ticket for how to use Kiwix as the source the LLM pulls information from directly, so it can build its answer organically and respond naturally to the question at hand without word-vomiting a complete wiki entry. Last I looked, you can poll the Kiwix DB directly without going through the search engine.

I can dig that up for you if it still exists; it's actually why I'm looking at Kiwix (back-burner project for now, but the spirit moved me).
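
The rough shape of the idea, sketched from scratch rather than copied from that ticket: resolve the title against the ZIM's suggestion index (no kiwix-serve, no full-text search), pull the entry straight out of the archive, and tell the model to answer from it instead of reciting it. Assumes python-libzim, beautifulsoup4, and requests, an Ollama-style OpenAI-compatible endpoint on localhost, and placeholder file/model names:

```python
# Rough sketch of the "poll the ZIM directly" idea: resolve a title via
# the suggestion index (no kiwix-serve, no SearXNG, no full-text search),
# hand the article to a local LLM, and instruct it to answer the question
# instead of reciting the entry.
# Assumptions: python-libzim + beautifulsoup4 + requests; an
# OpenAI-compatible endpoint on localhost:11434 (Ollama-style); the ZIM
# filename and model name are placeholders.
import requests
from bs4 import BeautifulSoup
from libzim.reader import Archive
from libzim.suggestion import SuggestionSearcher

ZIM_PATH = "wikipedia_en_all_nopic.zim"  # placeholder filename
LLM_URL = "http://localhost:11434/v1/chat/completions"  # assumed local endpoint


def article_text(zim: Archive, title_guess: str) -> str:
    """Resolve a title guess via the suggestion index and return plain text."""
    results = SuggestionSearcher(zim).suggest(title_guess).getResults(0, 1)
    path = next(iter(results), None)
    if path is None:
        return ""
    html = bytes(zim.get_entry_by_path(path).get_item().content).decode("utf-8", "replace")
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)


def ask(question: str, title_guess: str) -> str:
    context = article_text(Archive(ZIM_PATH), title_guess)[:6000]  # crude prompt cap
    resp = requests.post(LLM_URL, json={
        "model": "local-model",  # placeholder model name
        "messages": [
            {"role": "system",
             "content": "Answer using only the article below. Be concise; "
                        "do not reproduce the article verbatim.\n\n" + context},
            {"role": "user", "content": question},
        ],
    }, timeout=120)
    return resp.json()["choices"][0]["message"]["content"]


print(ask("What does the Kiwix project do?", "Kiwix"))
```

The system prompt is what handles the "don't word-vomit the whole entry" part; the suggestion index is what lets you skip the full-text search entirely.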

PS: Are you aware of LLM-wiki? That might suit your purposes better if your corpus is bespoke and still being updated. Works nicely.

https://tinyurl.com/llmwiki