Darkassassin07

joined 2 years ago
[–] Darkassassin07@lemmy.ca 1 points 3 weeks ago

That will solve part of the problem, preventing downloads before an item has even released; but there's still plenty of potential to grab unwanted torrents and leave the arrs asking for manual intervention when they can't import them.

Ideally the indexers would be filtering out this junk before users can even grab it, but failing that, I think we've got a decent solution. Check out the edited OP.

[–] Darkassassin07@lemmy.ca 2 points 3 weeks ago (1 children)

Check out the edited OP.

[–] Darkassassin07@lemmy.ca 2 points 3 weeks ago* (last edited 3 weeks ago)

I'm taking a look at this. It looks like it's the malware-blocker portion that I'm interested in, but if I enable it and 'delete known malware', it just complains every minute that there are no blocklists enabled (though the docs say it's supposed to fetch one from a pages.dev URL that has almost no content).

Do you have a specific malware blocklist configured? Enabling the per-service blocklists demands a URL for one.

I can host/build a list over time for these to use if that's what I've gotta do; just wondering if there's a public collaboration on one already on the go.

/edit: found it

https://raw.githubusercontent.com/Cleanuparr/Cleanuparr/refs/heads/main/blacklist

[–] Darkassassin07@lemmy.ca 5 points 3 weeks ago (1 children)

That's what I'd already done as per the OP, but it leaves Sonarr/Radarr wanting manual intervention for the 'complete' download that doesn't have any files to import.

[–] Darkassassin07@lemmy.ca 2 points 3 weeks ago

I just did some digging and found I do have some good-quality content from them, but it was all grabbed via NZBgeek.

Every torrent I've gotten with that label has been garbage/malware.

[–] Darkassassin07@lemmy.ca 4 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

This comment prompted me to look a little deeper at this. I looked at the history for each show where I've had failed downloads from those groups.

For SuccessfulCrab: any time a release has come from a torrent tracker (I only have free public torrent trackers), it's been garbage. I have, however, had a number of perfectly fine downloads with that group label whenever they were retrieved from NZBgeek. I've narrowed that filter to block the string 'SuccessfulCrab' on all torrent trackers but allow NZBs. Perhaps there's an impersonator trying to smear them or something, idk.

ELiTE, on the other hand, I've only ever grabbed as torrents, and every one of them was trash. That one's going to stay blocked everywhere.


The 'block potentially dangerous' setting is interesting, but what exactly does it look for? The torrent client is already set not to download file types I don't want, so will it recognize and remove torrents that are effectively empty (everything marked 'do not download')? I'm having a hard time finding documentation for that.

[–] Darkassassin07@lemmy.ca 4 points 3 weeks ago (3 children)

Awesome. Thanks, you two; I appreciate the help. :)

[–] Darkassassin07@lemmy.ca 5 points 3 weeks ago

Awesome. Thanks, you two; I appreciate the help. :)

[–] Darkassassin07@lemmy.ca 10 points 3 weeks ago (10 children)

Ok, I think I've got this right?

Settings > Profiles > Release Profiles.

Created one, set up the 'must not contain' words, indexer 'any', enabled.

That should just apply globally? I'm not seeing anywhere else I've got to enable it in specific series, clients, or indexers.
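
(For the curious: the same thing can be scripted against Sonarr's v3 API. A rough sketch, not something from the docs verbatim; the endpoint is real, but the field names vary a bit between Sonarr versions, and the URL/key are placeholders:)

```python
import requests

# Placeholders: adjust the host and use your own API key
# (Sonarr > Settings > General).
SONARR_URL = "http://localhost:8989"
API_KEY = "your-sonarr-api-key"

# A release profile that rejects both strings on every indexer.
# Field names follow my reading of the v3 API; some versions expect
# the required/ignored terms as comma-separated strings instead.
profile = {
    "enabled": True,
    "required": [],
    "ignored": ["SuccessfulCrab", "ELiTE"],  # the 'must not contain' terms
    "indexerId": 0,  # 0 = any indexer
    "tags": [],      # no tags = applies to all series
}

resp = requests.post(
    f"{SONARR_URL}/api/v3/releaseprofile",
    json=profile,
    headers={"X-Api-Key": API_KEY},
    timeout=30,
)
resp.raise_for_status()
```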

 

Sonarr and Radarr keep grabbing releases from a couple of specific groups ('SuccessfulCrab' and 'ELiTE') for items that clearly haven't even aired yet. These almost always contain only .scr or .lnk files, which have been blocklisted in my torrent client. This leaves Sonarr/Radarr awaiting manual intervention for 'complete' downloads that contain no files.

How do I get them to block anything and everything that contains the strings 'SuccessfulCrab' or 'ELiTE'??? I want them to stop even trying to grab anything released by those two groups.

I'm so sick of dealing with these.


[EDIT]

OK, so I have been looking at this from the wrong angle.

It's not these groups I'm upset with, but malware uploaders masquerading as release groups. The names can and will change, making this a game of whack-a-mole if I try to fight it this way.

Initially I'd followed the instructions below to block those names/strings from being grabbed by the arrs, and that works fantastically; but as above, whack-a-mole. Plus, both SuccessfulCrab and ELiTE have plenty of good releases out there; it's not their fault someone's using their names.

So, I'm now running Cleanuparr.

This maintains a large list of unwanted filetypes in qBittorrent. Then, when qBittorrent marks a torrent as complete because none of its files are wanted, Cleanuparr removes it from both qBittorrent and the arr that requested it, while also triggering a new search for the item.

It can also clean up items failing to import, stalled downloads, torrents stuck downloading metadata, and things that are just absurdly slow, with varying timescales/stringency.

I'll run this for a bit and see how it goes.
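
(For anyone wondering what that looks like mechanically, here's my rough understanding of the loop, sketched with the qbittorrent-api package and Sonarr's v3 API. Untested, the endpoints/credentials are placeholders, and the real tool does considerably more:)

```python
import qbittorrentapi
import requests

SONARR = "http://localhost:8989"                # placeholder
HEADERS = {"X-Api-Key": "your-sonarr-api-key"}  # placeholder

qbt = qbittorrentapi.Client(host="localhost", port=8080,
                            username="admin", password="adminadmin")

# Sonarr's download queue, used to map torrents back to episodes.
queue = requests.get(f"{SONARR}/api/v3/queue", headers=HEADERS,
                     params={"pageSize": 200}, timeout=30).json()["records"]

for torrent in qbt.torrents_info(status_filter="completed"):
    files = qbt.torrents_files(torrent_hash=torrent.hash)
    # priority == 0 is qBittorrent's 'do not download'; when the
    # exclusion list catches every file, the torrent 'completes' empty.
    if not files or any(f.priority != 0 for f in files):
        continue
    for item in queue:
        if item.get("downloadId", "").lower() != torrent.hash.lower():
            continue
        # Pull it from Sonarr's queue, blocklist the release, and
        # remove the torrent (and its data) from the download client.
        requests.delete(f"{SONARR}/api/v3/queue/{item['id']}",
                        headers=HEADERS,
                        params={"removeFromClient": True, "blocklist": True},
                        timeout=30)
        # Then ask Sonarr to go find a replacement.
        requests.post(f"{SONARR}/api/v3/command", headers=HEADERS,
                      json={"name": "EpisodeSearch",
                            "episodeIds": [item["episodeId"]]},
                      timeout=30)
```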

[–] Darkassassin07@lemmy.ca 1 points 3 weeks ago

To be perfectly honest, auto updates aren't really necessary; I'm just lazy and like automation. One less thing I've gotta remember to do regularly.

I find it kind of fun to discover and explore new features on my own as they appear. If I need documentation, it's (usually...) there, but I'd rather just explore. There are a few projects where I'm avidly following the forums/git pages so I'm at least aware of certain upcoming features, others update whenever they feel like it and I'll see what's new next time I happen to be messing with them.

Watchtower notifies me whenever it updates something so I've at least got a history log.

[–] Darkassassin07@lemmy.ca 18 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

I've had Immich auto-updating alongside around 36 other Docker containers for at least a year now. I've very, very rarely had issues, and I just pin specific version tags on the things that have caused problems. Redis and Postgres, for example, have fixed version tags in both Immich and Paperless-ngx, because upgrading their old databases takes manual work. The main projects, though, have always auto-updated just fine for me.

The reason I don't really worry about it: Solid backups.

BorgBackup runs in the early AM, shortly before Watchtower updates almost all of my containers, making a backup of the entire system (not including bulk storage) first.

If I get up in the morning and find a service isn't responding (Uptime Kuma notifies me via email if it can't reach any container or service), I'll mess with it and try to get the update working (I've only actually had to do this once so far; the rest have updated smoothly). Failing that, I can just extract yesterday's data from the most recent backup and restore the previous version.

Because of Borg's compression and de-duplication, successive backups of the same system can be stored in an absurdly small amount of space. I currently have 22 backups of ~532GB each, going back a full year, stored in 474GB of disk space. Raw, that'd be ~11.7TB.
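
(The backup job itself is nothing exotic. Roughly this shape, though the repo path, excludes, and retention here are made-up examples rather than my actual config:)

```python
import subprocess
from datetime import date

REPO = "/mnt/backups/borg-repo"  # placeholder path

# Snapshot the whole system, minus bulk storage.
subprocess.run(
    ["borg", "create", "--stats", "--compression", "zstd",
     "--one-file-system", "--exclude", "/mnt/bulk",
     f"{REPO}::system-{date.today()}", "/"],
    check=True,
)

# Thin out old archives; de-duplication between the survivors is
# what keeps a year of ~532GB snapshots inside ~474GB of disk.
subprocess.run(
    ["borg", "prune",
     "--keep-daily", "7", "--keep-weekly", "4", "--keep-monthly", "12",
     REPO],
    check=True,
)
```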

 

I have a pile of part lists for tools I'm maintaining, in PDF format, and I'm looking for a good way to take a part number, search through the collection of PDFs, and output which files contain that number. Essentially, this would let me match random unknown part numbers to a tool in our fleet.

I'm pretty sure the majority of them are actual text you can select and copy/paste, so searching those shouldn't be too difficult; but I know there are at least a couple in there that are just a string of JPGs packed into a PDF. Those would probably need OCR, but tbh I can live with skipping them altogether.

I've been thinking of spinning up an instance of Paperless-ngx and stuffing them all in there so it can index the contents (including with OCR), then using its search feature; but that seems a tad overkill.

I'm wondering if you fine folks have any better ideas. What do you think?
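
(The simplest thing I can think of, assuming the text-based PDFs really are the majority, is a little script with pypdf. An untested sketch; image-only PDFs extract as empty text and just never match:)

```python
import sys
from pathlib import Path

from pypdf import PdfReader  # pip install pypdf


def find_part_number(root: str, part_number: str) -> None:
    """Print every PDF under root whose extracted text contains part_number."""
    needle = part_number.lower()
    for pdf in sorted(Path(root).rglob("*.pdf")):
        try:
            reader = PdfReader(pdf)
            text = " ".join(page.extract_text() or "" for page in reader.pages)
        except Exception as exc:  # corrupt/encrypted file: note it, move on
            print(f"skipped {pdf}: {exc}", file=sys.stderr)
            continue
        if needle in text.lower():
            print(pdf)


if __name__ == "__main__":
    find_part_number(sys.argv[1], sys.argv[2])
```

Usage would be something like `python find_part.py ./manuals 12345-67`.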

 

What do you prefer to use for a password manager?

How well does it work on mobile? (Specifically, using autofill on Android 14.)

I'm currently using Vaultwarden, but the Android app, which is where I'm using it 95% of the time, has always been a bit flaky about getting autofill to pop up. Now it's decided to stop working entirely, so I'm going to look around at some alternatives for now.

/edit:

Well, idk what happened.

I spent about 30 minutes trying different things: switched Android's autofill settings to another app, changed them back, cleared app data, force-stopped everything relevant, re-installed Bitwarden, restarted the device, messed with accessibility; nothing seemed to work. Bitwarden adamantly refused to pop up for autofill in anything I tried (4-5 different sites in Chrome, Firefox, and DuckDuckGo; the OpenVPN app; Jerboa; my bank. Nothing worked. Absolutely zero sign of autofill anywhere.)

I made this post and went for a walk.

Now suddenly autofill is working again.

I hate technology sometimes.

/edit again:

The best option I've seen so far: there's an 'autofill' Quick Settings button you can add to the notification tray that opens the vault and asks which item to fill with (just like the 'open vault' inline autofill option). If inline isn't popping up, use that.

 

It's 2028; Trump has lost his bid for re-re-election. America has somehow held together as a single nation and succeeded in electing a new leader.

You've been tasked with designing and creating a sculpture/statue/art piece to commemorate the ordeal America has just survived.

What do you do/create?

Text/drawn art preferred, but you can post AI art if you really want. LMK if I'm posting this in the wrong place; happy to move it if I've picked wrong.

 

The banner image is completely broken.

And the server hosting the community's profile image has an expired SSL cert.

-1
submitted 10 months ago* (last edited 10 months ago) by Darkassassin07@lemmy.ca to c/selfhosted@lemmy.world
 

Are any of you aware of projects similar to DizqueTV, an HDHomeRun tuner simulator that creates simulated live TV channels? (DizqueTV depends on Plex integration and cannot be used without it.)

I'm looking for a solution to create simulated 'TV' channels by defining local content to be played on a schedule; ideally just selecting a few shows to be played, mixed together. These channels would then be added to Emby/Plex/Jellyfin for users to tune into just like regular live TV.

I've been keeping an eye on DizqueTV for over a year now, awaiting Plex independence, but I don't think that'll happen anytime soon. Wondering if there are alternatives.

/edit: should probably link the project I'm talking about...

https://github.com/vexorian/dizquetv

 

In the last couple of weeks, I've started getting this error about 1 in 5 times when I try to open one of my own locally hosted services.

I've never used ECH, and I've always explicitly restricted nginx to TLS 1.2, which doesn't support it. Why am I suddenly getting this? Why does it randomly error, then work just fine again two minutes later? And how can I prevent it altogether? Is anyone else experiencing this?

I'm primarily noticing it with Ombi, and I'm mainly using Chrome on Android. But checking just now: DuckDuckGo loads the page just fine every time, and Firefox flat out refuses to load it at all.

Firefox refuses to show the cert it claims is invalid, and 'accept and continue' just reloads the error page. Chrome will show the cert, and it's the correct, valid cert from LE.

There are 20+ services going through the same nginx proxy, all using the same wildcard cert and identical SSL configurations, but Ombi is the only one suddenly giving me this issue regularly.

The vast majority of my services are accessed via LAN/VPN; I don't need or want ECH, though I'd like to keep at least a basic HTTPS setup.

Solution: replace the local A/AAAA records with a CNAME record pointing to a local-only domain that has its own local A/AAAA records. See the comments below for clarification.
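
(For anyone hitting the same thing: the quick way to see whether a name is advertising an ECH config is to look for a DNS HTTPS record, which is where browsers pick it up. A sketch with dnspython; the hostname is a placeholder:)

```python
import dns.resolver  # pip install dnspython

# Placeholder: substitute the name that's throwing ECH errors.
name = "service.example.com"

try:
    # Type 65 'HTTPS' records carry the ECH config. If the public
    # name returns one (e.g. via Cloudflare), browsers may attempt
    # ECH even though a local override points at a LAN address.
    for rr in dns.resolver.resolve(name, "HTTPS"):
        print(rr.to_text())
except dns.resolver.NoAnswer:
    print("no HTTPS record; ECH shouldn't be attempted")
```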

0
submitted 2 years ago* (last edited 2 years ago) by Darkassassin07@lemmy.ca to c/selfhosted@lemmy.world
 

After almost a year of repeated emails stating that the transition from Google Domains would have no effect on customers and that no action was required, I just got this email:

Update Dynamic DNS records

Hi there, As previously communicated, Squarespace has purchased all domain name registrations and related customer accounts from Google Domains. Customers are in the process of being moved to Squarespace Domains, but before we migrate your domain [redacted], we wanted to inform you that a feature you use, Dynamic DNS (DDNS), will not be supported by Squarespace.

So apparently Squarespace will be entirely useless to me, and I've got "as soon as 30 days" to move.

Got any suggestions for good registrars to migrate to?

(it's a .pw domain if that matters)

/edit: I'm a moron.

I already use Cloudflare as my nameserver; Google/Squarespace only handles the registration.

I'll be fine. Thanks for the help, everyone!
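
(For anyone landing here with the same problem: once the zone is served by Cloudflare, DDNS is just a small script hitting their API on a timer. A sketch; the IDs, token, and hostname are placeholders:)

```python
import requests

# Placeholders: the zone/record IDs and an API token come from the
# Cloudflare dashboard.
ZONE_ID = "your-zone-id"
RECORD_ID = "your-record-id"
TOKEN = "your-api-token"

# Discover the current public IP (ipify is one of several such services).
ip = requests.get("https://api.ipify.org", timeout=30).text

# Point the A record at it.
resp = requests.put(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/dns_records/{RECORD_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"type": "A", "name": "home.example.pw", "content": ip, "ttl": 300},
    timeout=30,
)
resp.raise_for_status()
```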
