this post was submitted on 15 Oct 2023
87 points (96.8% liked)

Selfhosted


For the last two years, I've been treating compose files as individual runners for individual programs.

Then I brainstormed the concept of having one singular docker-compose file that defines every single running container on my system (at least the ones that can use compose), where each install starts at the same root directory and the volumes branch out from there.

Then I find out this is how most people use compose: one compose file, with volumes and directories branching out from wherever ./ is called.
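
To be concrete, the layout I mean looks roughly like this sketch (the services, images, and relative paths are just illustrative examples, not anyone's actual setup):

# /docker/docker-compose.yml (hypothetical root; every bind mount is relative to this directory)
services:
  freshrss:
    image: freshrss/freshrss
    volumes:
      - ./freshrss/data:/var/www/FreshRSS/data
  pihole:
    image: pihole/pihole
    volumes:
      - ./pihole/etc-pihole:/etc/pihole
      - ./pihole/etc-dnsmasq.d:/etc/dnsmasq.d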

THEN I FIND OUT... that most people who discover this move their installations to podman, because compose works on different versions per app, calling those versions breaks the concept of having one singular docker-compose.yml file, and podman doesn't need a version for compose files.

Is there some meta for the best way to handle these apps collectively?

[–] Mythnubb@lemm.ee 11 points 1 year ago (1 children)

As others have said, I have a root docker directory and then directories inside it for all my stacks, like Plex. Then I run this script, which loops through them all to update everything in one command.

#!/bin/bash
# For each stack directory under /docker, pull new images and recreate the containers.
for n in plex-system bitwarden freshrss changedetection.io heimdall invidious paperless pihole transmission dashdot
do
    cd "/docker/$n" || continue   # skip the stack if its directory is missing
    docker-compose pull
    docker-compose up -d
done

# Clean up superseded images once everything has been recreated.
echo "Removing old docker images..."
docker image prune -f
[–] pete_the_cat@lemmy.world 17 points 1 year ago (1 children)

Or just use the Watchtower container to auto-update them 😉
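
For anyone who hasn't run it, a typical Watchtower service dropped into the same compose file looks roughly like this (a sketch using the containrrr/watchtower image; the schedule is just an example):

services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock  # needed so it can inspect and restart other containers
    environment:
      - WATCHTOWER_CLEANUP=true          # prune old images after an update
      - WATCHTOWER_SCHEDULE=0 0 4 * * *  # example: check once a day at 4am (6-field cron)
    restart: unless-stopped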

[–] DH10@feddit.de 7 points 1 year ago (2 children)

I don't like the auto-update function. I also use a script similar to the one OP uses (with a .ignore file added). I like to be in control of when (or if) updates happen. I use Watchtower as a notification service.
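
If it helps anyone, the notification-only behaviour is just a couple of environment variables on the Watchtower container; something like this sketch (the notification URL is a placeholder for whatever shoutrrr target you actually use):

    environment:
      - WATCHTOWER_MONITOR_ONLY=true                          # report available updates, never apply them
      - WATCHTOWER_NOTIFICATION_URL=discord://token@channel   # placeholder shoutrrr URL, swap in your own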

[–] Mythnubb@lemm.ee 1 points 1 year ago (1 children)

Exactly, when something updates, I want to be the one to initiate it so I can make sure everything goes as it should.

[–] pete_the_cat@lemmy.world 1 points 1 year ago

Nothing of mine is so important that I couldn't recreate or roll back the container if it does happen to screw up.

[–] chiisana@lemmy.chiisana.net 1 points 1 year ago

I scream test myself… kidding aside, I try to pin to major versions where possible: Postgres:16-alpine, for example, will generally not break between updates and things should just chug along. It's when indie devs don't tag anything other than latest, or don't adhere to semantic versioning best practices, that I keep Watchtower off and just update manually once in a blue moon.
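
In compose terms that just means choosing the tag on the image line, something like this sketch (service name and data path are illustrative):

services:
  db:
    image: postgres:16-alpine   # pinned to major version 16: minor and patch updates are generally safe
    # image: postgres:latest    # what I avoid: a silent jump to the next major can require a manual data migration
    volumes:
      - ./postgres/data:/var/lib/postgresql/data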