this post was submitted on 15 May 2026
Self-hosting

Hosting your own services. Preferably at home and on low-power or shared hardware.

I've transferred my Google backup over multiple times, so I now have multiple "takeout" directories full of duplicate data. Some of them are still zip files.

Is there a tool I can use to find and delete duplicate files? I tried letting dupeGuru run, but it eventually said I was out of memory and never got as far as finding or deleting anything.

Edit: I'm on Linux.
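Until a dedicated tool works out, one low-memory way to spot exact duplicates on Linux is a hash-and-group pipeline: it streams through the files rather than building an in-RAM index the way a GUI deduper does. The directory is illustrative; run it from wherever the takeout copies live.

```shell
# List groups of byte-identical files under the current directory.
# sha256sum prints "<64-hex-digest>  <path>", so comparing only the
# first 64 characters with uniq groups lines by content hash.
find . -type f -print0 \
  | xargs -0 sha256sum \
  | sort \
  | uniq -w64 --all-repeated=separate
```

Note that `-w` and `--all-repeated` are GNU extensions to `uniq`, which is fine on a typical Linux install. Each blank-line-separated group in the output is one set of identical files; deleting all but one per group is then up to you.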

[–] BrianTheeBiscuiteer@lemmy.world 3 points 2 hours ago

I've been using jdupes for a week or so and it's worked well. It also has a few options for reclaiming space without deleting anything, if organization isn't your priority: replacing duplicates with symlinks or hardlinks, and block-level deduplication on filesystems that support it.
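For anyone who hasn't used jdupes before, typical invocations look something like this. The paths are hypothetical, and flags vary a bit between versions, so check `man jdupes` on your system.

```shell
# List duplicate sets across both Takeout copies (-r recurses):
jdupes -r ~/takeout-1 ~/takeout-2

# Delete duplicates; -d prompts per set, adding -N keeps the first
# file in each set without prompting:
jdupes -r -d -N ~/takeout-1 ~/takeout-2

# Or keep every path but reclaim the space by hard-linking duplicates:
jdupes -r -L ~/takeout-1 ~/takeout-2
```

Running the plain listing form first is a cheap way to sanity-check what would be deleted before reaching for `-d -N`.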