So I started the Google Takeout process because I want to move my photos from Google Photos to Immich.

Google emailed me saying the archives are ready... uh... I have to download 81 zip files, each 2 GB... 😬

Is there an easy way to download all of these files? Or do I have to click "download" 81 times and hope the downloads don't get interrupted?

[–] skoberlink@lemmy.world 7 points 1 day ago

I have tried to solve this many times as I want to regularly back up my Google content - mostly the images for the same purpose you mention.

Unfortunately, I've never come up with or found a good solution. I even looked into scripting it with something like Puppeteer, but Takeout requires you to re-confirm your authentication regularly, and I haven't found a good way around that since there's no API access. It also won't let you use CLI tools like wget directly. You could probably figure out how to pull a token or cookie to hand to a CLI tool, but you'd have to redo it so often that it's more of a pain than just downloading manually (rough sketch of that idea below).
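
If anyone wants to experiment with the cookie idea, something like this is roughly what I mean — completely untested, and the download URL pattern, job ID, and cookie string are all placeholders you'd have to copy out of your browser's dev tools while logged in:

```python
# Rough, untested sketch of the "replay a browser cookie" idea.
# Requires the third-party "requests" package (pip install requests).
import requests

JOB_ID = "your-takeout-job-id"   # placeholder: copy from a real download link
COOKIE = "paste the Cookie header from an authenticated download here"
NUM_PARTS = 81                   # the OP's archive count

for i in range(1, NUM_PARTS + 1):
    # The URL pattern and whether the index starts at 0 or 1 are guesses
    # based on what the download links look like -- verify before relying on it.
    url = f"https://takeout.google.com/settings/takeout/download?j={JOB_ID}&i={i}"
    with requests.get(url, headers={"Cookie": COOKIE}, stream=True, timeout=60) as r:
        r.raise_for_status()
        with open(f"takeout-{i:03d}.zip", "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    print(f"downloaded part {i}/{NUM_PARTS}")
```

In practice you have to keep refreshing that cookie, which is why it ends up being more hassle than clicking through manually.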

My current solution is to run a Firefox browser in a container on my server and do the downloads there. It acts as a sort of session manager (like tmux or zellij for the command line), so if the PC I'm using goes to sleep or something, the downloads keep going, and I just check in occasionally through the day. Plus I wanted the files on the server anyway, so downloading them there directly saves me having to transfer them afterwards. A rough docker command is below.
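
For anyone who wants to try the same idea, one way to get a containerized Firefox is the linuxserver.io image, which serves the browser over a web UI (port 3000 per its docs; the container name, volume path, and sizes here are just examples to adapt):

```
docker run -d \
  --name=takeout-firefox \
  -p 3000:3000 \
  -v /srv/takeout:/config \
  --shm-size=1gb \
  lscr.io/linuxserver/firefox:latest
```

Then you open http://your-server:3000, log in to Google inside that Firefox session, and kick off the downloads; the session keeps running on the server even if you close the tab on your own machine.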

Switching the archive format to .tgz lets you pick sizes up to 50 GB per file, which at least means fewer archives to grab and a longer time between interactions (so I can actually do something useful in the meantime).

I sincerely hope someone proves me wrong and has a way to do this, but I've searched a lot. I know other people want to solve it, but I've never seen anyone with a solution.