Fair question. Use case: you take rough notes during a meeting, no formatting, just raw thoughts. AI can clean them up, summarize, or restructure after the fact. It's completely optional though. Disabled by default, doesn't even show in the context menus unless you explicitly configure it in settings with your own API key. If you don't want it, it's like it doesn't exist.
So, a feature for those who want it, but turned off out of the box for those who absolutely do not want it? Did I understand correctly?
Exactly. Off by default, invisible unless you enable it.
As AI features should be. You're the dev?
Yes, I am.
Cool. I appreciate this design decision. If only more went that route (looking at you, Microslop)
I see on the page it says you can bring an Anthropic or OpenAI key. Can I also point it at my own locally hosted model?
Not at this moment. Which local model would you like to see as an additional option?
Ollama, LM Studio, llama.cpp
I don't know what's typical, but when I use AI locally I've been running llama.cpp with models grabbed from HF (e.g. QwenCoder). Then in my VS Code plugin (RooCode) I use the "OpenAI Compatible" provider option to point it at my local server.
Not sure how hard that is to get working, but my hope is that a single "OpenAI Compatible" option would cover all of these.
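For what it's worth, here's a minimal sketch of what that looks like from the client side. It assumes llama.cpp's llama-server is running locally on port 8080; the port and model name are just examples, and Ollama exposes the same API shape at http://localhost:11434/v1:

```python
# Minimal sketch: pointing an OpenAI-style client at a local server.
# Assumes llama.cpp's llama-server was started with something like:
#   llama-server -m model.gguf --port 8080
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local endpoint instead of api.openai.com
    api_key="sk-no-key-required",         # llama-server ignores the key, but the client requires a value
)

resp = client.chat.completions.create(
    model="local-model",  # llama-server ignores this; Ollama uses it to pick the loaded model
    messages=[{"role": "user", "content": "Clean up and summarize these meeting notes: ..."}],
)
print(resp.choices[0].message.content)
```

So the app wouldn't need a separate integration per backend: anything that speaks the OpenAI chat-completions API works by just swapping out the base URL.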