this post was submitted on 26 Nov 2025
403 points (96.8% liked)

Selfhosted


Got a warning for my blog going over 100GB in bandwidth this month... which sounded incredibly unusual. My blog is text and a couple images and I haven't posted anything to it in ages... like how would that even be possible?

Turns out it's possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for 'Unknown robot'? This is actually bonkers.

Edit: As Thunraz points out below, there's a footnote that reads "Numbers after + are successful hits on 'robots.txt' files", so the number isn't scientific notation after all.

Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading those wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That's when my account was suspended for exceeding bandwidth (it's an artificial limit I put on there a while back and forgot about...), which is also why the 'last visit' for all the bots is November 12th.

[–] deffard@lemmy.world 1 points 1 day ago (1 children)

You just solve it, as per the blog post, because it's trivial to solve: your browser is literally doing so in a slow language on a potentially slow CPU, and it's only solving 5 digits of the hash by default.

If a phone running JavaScript in the browser has to be able to solve it, you can't just crank up the difficulty. Real humans will only wait tens of seconds, if that, before giving up.
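
For the curious, the browser-side part of such a challenge is just a brute-force loop: keep hashing challenge-plus-nonce until the digest starts with enough zero hex digits. A rough sketch (the function and parameter names are made up for illustration, not Anubis's actual API):

// Brute-force a nonce so that SHA-256(challenge + nonce) starts with
// `difficulty` zero hex digits. Illustrative sketch only.
async function solveChallenge(challenge, difficulty = 5) {
  const prefix = "0".repeat(difficulty);
  const encoder = new TextEncoder();
  for (let nonce = 0; ; nonce++) {
    const buf = await crypto.subtle.digest("SHA-256", encoder.encode(challenge + nonce));
    const hex = Array.from(new Uint8Array(buf))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
    if (hex.startsWith(prefix)) return nonce; // proof of work found
  }
}

Each extra digit of difficulty multiplies the expected work by 16, which is why it can't simply be cranked up for everyone.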

[–] SlurpingPus@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

This here is the implementation of SHA-256 in the slow language JavaScript:

// `message` is the input string to hash
const msgUint8 = new TextEncoder().encode(message);
// Web Crypto hands the hashing to native code; JS isn't doing the work itself
const hashBuffer = await window.crypto.subtle.digest("SHA-256", msgUint8);
// Uint8Array.prototype.toHex() is recent; older browsers need a map/join fallback
const hashHex = new Uint8Array(hashBuffer).toHex();

You imagined that JS had to have that done from scratch, with sticks and mud? Every OS has cryptographic facilities, and every major browser supplies an API to them.

As for using it to filter out bots, Anubis does in fact get it a bit wrong: you have to incur this cost at every webpage hit, not once a week. So you can't just put Anubis in front of the site; you need to have the JS on every page, and if the challenge isn't solved by the next hit, you pop up the full page saying 'nuh-uh', probably make the browser do a harder challenge, and also check a bunch of heuristics like go-away does.
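
Server-side, a rough sketch of what checking the proof on every hit could look like (a hypothetical Express-style middleware, not Anubis's actual code, which instead hands out a cookie that stays valid for a week):

import crypto from "node:crypto";

// Hypothetical middleware: require a valid proof of work on every request.
// A real version would also verify that `challenge` was server-issued and
// single-use, so solutions can't be replayed.
function powGate(difficulty = 5) {
  const prefix = "0".repeat(difficulty);
  return (req, res, next) => {
    const { challenge, nonce } = req.query;
    const hex = crypto.createHash("sha256")
      .update(`${challenge}${nonce}`)
      .digest("hex");
    if (challenge && nonce && hex.startsWith(prefix)) return next();
    res.status(403).send(renderChallengePage()); // hypothetical challenge page
  };
}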

It's still debatable whether it will stop bots, which would just have to crank SHA-256 24/7 in between page downloads, but it does add a cost that bot owners have to eat.
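
Back-of-the-envelope, assuming the default of 5 hex digits: an average solve takes 16^5 ≈ 1.05 million SHA-256 computations. A single modern core running native code burns through that in well under a second, so per page the cost is tiny, but a scraper re-solving it on millions of pages has to keep whole cores pinned on hashing, and that cost scales with every site that deploys it.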