this post was submitted on 26 Nov 2025
403 points (96.8% liked)

Selfhosted

Got a warning for my blog going over 100GB in bandwidth this month... which sounded incredibly unusual. My blog is text and a couple images and I haven't posted anything to it in ages... like how would that even be possible?

Turns out it's possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for 'Unknown robot'? This is actually bonkers.

Edit: As Thunraz points out below, there's a footnote that reads "Numbers after + are successful hits on 'robots.txt' files", so it's not scientific notation.

Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading these wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That's when my account was suspended for exceeding bandwidth (an artificial limit I put in place a while back and forgot about...), which is also why the 'last visit' for all the bots is November 12th.
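For anyone who wants to do the same digging on their own server, here's a rough sketch that totals response bytes per user agent from an access log. It assumes the common combined log format (status, size, referer, then user agent at the end of each line); the regex and field positions are assumptions, so adjust for your server's actual log format:

```python
import re
from collections import Counter

# Matches the tail of a combined-format access log line:
# ..." <status> <bytes> "<referer>" "<user-agent>"
# Real logs vary, so treat this pattern as a starting point.
LINE_RE = re.compile(r'" (\d{3}) (\d+|-) "[^"]*" "([^"]*)"$')

def bytes_per_agent(lines):
    """Sum response bytes served to each user-agent string."""
    totals = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        _status, size, agent = m.groups()
        if size != "-":  # "-" means no body was sent
            totals[agent] += int(size)
    return totals
```

Sorting `bytes_per_agent(open("access.log"))` by value would show at a glance which crawler is hammering the wallpaper downloads.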

[–] slazer2au@lemmy.world 103 points 3 days ago (1 children)

AI scrapers are the new internet DDoS.

Might want to throw something in front of your blog to ward them off, like Anubis or a tarpit.
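The cheapest first step is a robots.txt that disallows the worst offenders. A sketch along these lines (the user-agent tokens are examples; check each crawler's docs for the exact token, and keep in mind that only well-behaved bots honor robots.txt at all, which is why tools like Anubis or a tarpit exist for the rest):

```
# Block specific crawlers by their advertised user-agent token
User-agent: Barkrowler
Disallow: /

User-agent: GPTBot
Disallow: /

# Everyone else may crawl, but not the wallpaper downloads
User-agent: *
Disallow: /wallpapers/
```

The `/wallpapers/` path here is hypothetical; point it at wherever the large files actually live.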

[–] Eyekaytee@aussie.zone 33 points 3 days ago* (last edited 3 days ago) (2 children)

the one with the quadrillion hits is this bad boy: https://www.babbar.tech/crawler

> Babbar.tech is operating a crawler service named Barkrowler which fuels and update our graph representation of the world wide web. This database and all the metrics we compute with are used to provide a set of online marketing and referencing tools for the SEO community.

[–] derpgon@programming.dev 1 points 1 day ago

Metrics on what - how much beating can a server take before it commits ritual Sudoku and fries itself?

[–] porcoesphino@mander.xyz 8 points 2 days ago (1 children)
[–] Jessica@discuss.tchncs.de 11 points 2 days ago (1 children)

It's a quote from the website

[–] Vorpal@programming.dev 11 points 2 days ago (2 children)

It is common custom to indicate quotes with either "quotation marks" or, for a longer quote, a

> block quote

The latter can be done by prefixing the line with a > here on Lemmy (it uses the common markdown syntax).

Doing either of these helps avoid ambiguity.

[–] Jessica@discuss.tchncs.de 5 points 2 days ago

You replied to the wrong person. I already know this, but clearly the person who posted the quote doesn't ;)

[–] porcoesphino@mander.xyz 5 points 2 days ago

Thanks for taking the time. I always find it hard to follow up and point out the ambiguity / alternative without coming across in some unwelcome way.