
When I first got into self-hosting, I originally wanted to join the Fediverse by hosting my own instance. After realizing I wasn't that committed to the idea, I went in a simpler direction.

Originally I was using Cloudflare's tunnel service, and watching the logs I would see traffic coming in from random corporations and places.

Being uncomfortable with Cloudflare after pivoting away from social media, I learned how to secure the device myself and switched to a reverse proxy on an uncommon port. My logs now only show activity when I am connecting to my own site.
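For the curious, here's roughly what that looks like if the reverse proxy is nginx (just a sketch; the port, backend address, and cert paths are all placeholders, not my actual setup):

```nginx
# Listen on a nonstandard port instead of 443; 8443 is a placeholder.
server {
    listen 8443 ssl;
    server_name example.com;

    # Placeholder cert paths; yours will differ.
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        # Forward traffic to the self-hosted app on localhost.
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```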

Which is what led me to this question.

What do bots and scrapers look for when they come to a site? Do they mainly probe well-known ports like 80 or 22 for vulnerabilities? Do they ever scan other ports looking for common services that may be insecure? Is it even worth their time to scan for open ports?

Seeing as I am tiny and obscure, I most likely won't need to do much research into protecting myself from such threats, but I am still curious about the threat bots pose to other self-hosters and larger platforms.

derek@infosec.pub · 1 point · 15 hours ago

Absolutely. VMs and containers are the wise sysadmin's friends. Instead of rolling my own IP blocker, I use Fail2Ban on public-facing machines. It's invaluable.
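For anyone who hasn't tried it, a minimal jail looks something like this (a sketch using the sshd filter that ships with Fail2Ban; the numbers are placeholders to tune):

```ini
# /etc/fail2ban/jail.local (a minimal sketch; values are placeholders)
[sshd]
enabled = true
# five failed attempts within ten minutes earns a one-hour ban
maxretry = 5
findtime = 10m
bantime = 1h
```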

DarkAri@lemmy.blahaj.zone · 1 point · 51 minutes ago

Cool, I have some ideas as well. For example: write a script that hashes configuration files and requires a secret password to put them into edit mode; if a config changes without edit mode being enabled first, disconnect the server. Maybe use a Raspberry Pi that's hidden from the network to do the checking. I know that wouldn't work for large websites, because they can't afford to go down for hours at a time, but it would give you an additional layer of security for sensitive stuff. I'm more into game programming, but I know how exploits work, and I'm pretty sure many things like this already exist on the market.

One idea I had was pretty neat. Basically, in your EULA you reserve the right to hack back people who try to hack you, and you run an automated system that uses known exploits to get a ping from, or maybe install a rootkit on, anyone trying to mess around in your system. Later you can get on and deanonymize them. This requires actually spending time researching your own zero-days. People in DEF CON hacking competitions do this; they are sort of masters with decompilers and hex editors.
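A rough sketch of that hashing idea in Python, just to illustrate: a flag file stands in for the password-gated edit mode, every path here is made up, and it only reports tampering instead of actually disconnecting anything.

```python
import hashlib
import json
from pathlib import Path

# All names and paths below are hypothetical placeholders.
WATCHED = [Path("/etc/nginx/nginx.conf"), Path("/etc/ssh/sshd_config")]
BASELINE = Path("/var/lib/confwatch/baseline.json")
EDIT_MODE_FLAG = Path("/var/lib/confwatch/edit_mode")  # created by the "edit mode" script

def digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def save_baseline() -> None:
    """Record current hashes; run this after a legitimate edit session."""
    BASELINE.parent.mkdir(parents=True, exist_ok=True)
    BASELINE.write_text(json.dumps({str(p): digest(p) for p in WATCHED}))

def check() -> None:
    """Compare current hashes to the baseline, unless edit mode is on."""
    if EDIT_MODE_FLAG.exists():
        return  # changes are expected right now
    baseline = json.loads(BASELINE.read_text())
    for path, expected in baseline.items():
        if digest(Path(path)) != expected:
            # A real version might tear down the network interface here;
            # this sketch just reports the tampering.
            print(f"ALERT: {path} changed outside edit mode")

if __name__ == "__main__":
    check()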