eskuero

joined 2 years ago
[–] eskuero@lemmy.fromshado.ws 1 points 1 week ago

Doesn't have a dashboard per se for centralized administration. It has a web UI to manually create/upload collections. I personally use it in a very simplistic way and just reupload an updated .vcf file with all my contacts from time to time.

About user management, I don't know how you installed radicale, but they have docs on this: https://radicale.org/v3.html#authentication
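For a bare install, a minimal htpasswd setup in Radicale's config looks roughly like this (file paths here are placeholders, adapt to your install):

```ini
# /etc/radicale/config -- auth section only, paths are placeholders
[auth]
type = htpasswd
htpasswd_filename = /etc/radicale/users
htpasswd_encryption = bcrypt
```

Users would then be added with something like `htpasswd -B -c /etc/radicale/users alice`.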

[–] eskuero@lemmy.fromshado.ws 3 points 1 month ago* (last edited 1 month ago)

Yes I do. I cooked up a small Python script that runs at the end of every daily backup:

import subprocess
import json
import os

# Output directory for the saved images
OUTPUT_DIR = "/data/dockerimages"
os.makedirs(OUTPUT_DIR, exist_ok=True)

# Grab all the docker images. Each line is a json string describing one image
imagenes = subprocess.run(
    ["docker", "images", "--format", "json"],
    stdout=subprocess.PIPE, stderr=subprocess.DEVNULL,
).stdout.decode().splitlines()

for imagen in imagenes:
    # Skip any empty trailing lines
    if not imagen.strip():
        continue
    datos = json.loads(imagen)
    # ID of the image to save
    imageid = datos["ID"]
    # Compose the output name like this:
    # ghcr.io-immich-app-immich-machine-learning:release:2026-01-28:3c42f025fb7c.tar
    fecha = datos["CreatedAt"].split(" ")[0]
    outputname = f"{datos['Repository']}:{datos['Tag']}:{fecha}:{imageid}.tar".replace("/", "-")
    # If the file already exists just skip it
    if not os.path.isfile(f"{OUTPUT_DIR}/{outputname}"):
        print(f"Saving {outputname}...")
        subprocess.run(["docker", "save", imageid, "-o", f"{OUTPUT_DIR}/{outputname}"])
    else:
        print(f"Already exists {outputname}")
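Since every new image version adds another tarball, a companion cleanup step (hypothetical, not part of the original script) could prune old archives so the backup directory doesn't grow unbounded:

```python
import os
import time

def prune_old_tars(directory, max_age_days=30):
    """Delete .tar files older than max_age_days; return the names removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        # Only touch regular .tar files whose modification time is past the cutoff
        if name.endswith(".tar") and os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Calling `prune_old_tars(OUTPUT_DIR)` at the end of the backup run would keep only the last month of saved images.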
[–] eskuero@lemmy.fromshado.ws 3 points 1 month ago

26, though this includes multi-container services like immich or paperless, which have 4 containers each.

59
submitted 2 months ago* (last edited 2 months ago) by eskuero@lemmy.fromshado.ws to c/selfhosted@lemmy.world
 
  • A different device from your home server?
  • On the same home server as the services but directly on the host?
  • On the same home server as the services but inside some VM or container?

Do you configure it manually or do you use some helper/interface like WGEasy?

I have personally been using wgeasy, but I recently started locking down and hardening my containers, and this node app running as root is kinda...
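For comparison, manual configuration is just one file per interface; a minimal sketch (keys, addresses and port are placeholders, not a real setup):

```ini
# /etc/wireguard/wg0.conf -- placeholder keys and addresses
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# one client device
PublicKey = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

Then `wg-quick up wg0` brings it up, with no web UI or root node process involved.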

 

I'm talking not only about trusting the distribution chain but also about the situation where some services don't rebuild their images with updated bases unless they have a new release.

So for example, if the service's latest tag is a year old, they keep distributing it with a year-old alpine base...

[–] eskuero@lemmy.fromshado.ws 8 points 3 months ago (1 children)

I run changedetection and monitor the sample .yml files that projects usually host directly in their git repos.
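For anyone wanting to do the same, a minimal compose sketch for changedetection.io (port and volume path are just an example, adapt to your setup):

```yaml
services:
  changedetection:
    image: ghcr.io/dgtlmoon/changedetection.io
    container_name: changedetection
    ports:
      - '5000:5000'
    volumes:
      - './datastore:/datastore'
    restart: always
```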

[–] eskuero@lemmy.fromshado.ws 1 points 3 months ago

Bring back my computer as well

[–] eskuero@lemmy.fromshado.ws 2 points 3 months ago (2 children)

people say go back in time to pick the correct lotto number

I say go back in time and sell my 8TB disk for 80 billion

[–] eskuero@lemmy.fromshado.ws 3 points 4 months ago

For the price of the car I would expect the SSD drive to grow wheels and be able to actually drive it

[–] eskuero@lemmy.fromshado.ws 2 points 5 months ago

I tried it in the past and it felt too heavy for my use case. Also, for some reason the sidebar menu doesn't show all the items at all times; instead it only shows the ones related to the branch you just went into.

Also it seems pretty dead updates-wise.

mdBook is really nice if you don't mind the lack of dynamic editing in a web browser.

[–] eskuero@lemmy.fromshado.ws 17 points 8 months ago (4 children)

I always run headscale on my own server for my own network.

[–] eskuero@lemmy.fromshado.ws 2 points 9 months ago

That's not including VAT, kek.

And for some reason it's always been this pricey with serverpartdeals; other stuff from the USA, not so much.

[–] eskuero@lemmy.fromshado.ws 7 points 9 months ago* (last edited 9 months ago) (3 children)

Server Part Deals has great prices on the hardware.

But the shipping to my location is for some reason 120€, a 50% increase over the product price, wtf.

Is the warehouse on the moon or what!

[–] eskuero@lemmy.fromshado.ws 1 points 11 months ago

The easiest way by far is downloading an existing dump from Kiwix.

For example, wikipedia_en_all_nopic_2024-06.zim is only 54GB since it contains only text. Then via docker you could use this compose file, where your .zim files live in the wikis volume:

services:
  kiwix:
    image: ghcr.io/kiwix/kiwix-serve
    container_name: kiwix_app
    command: '*'
    ports:
      - '8080:8080'
    volumes:
      - "/wikis:/data"
    restart: always

Theoretically you could actually use one of the Wikipedia database dumps with MediaWiki, but I don't know of any easy plug-and-play guide.
