this post was submitted on 12 Jun 2023

Self Hosted - Self-hosting your services.

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Hey folks, I have multiple VMs and personal machines across multiple cloud providers, and I'm beginning to get frustrated with SSH key management. Each personal machine has its own key, so if one is lost or compromised I can just remove that key from the VMs, but it's getting tedious to make sure everything is up to date and any new keys are added.

Are there any solutions out there that would help?

[–] Puddy@feddit.de 2 points 1 year ago

Get a USB dongle that holds the key and handles the crypto without exposing the key to the host (e.g. a YubiKey is popular). That way your secret travels with you and you have one key for everything. Be sure to have a script ready for revoking the key on all machines in case you lose it :)
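The revocation script mentioned above can be sketched in plain shell. Everything here is hypothetical (file names, key comments); it just deletes the compromised key, matched by its trailing comment field, from an authorized_keys file.

```shell
# Remove a compromised public key (matched by its comment field) from an
# authorized_keys file. Hypothetical names throughout.
revoke_key() {  # usage: revoke_key <authorized_keys_path> <key_comment>
  sed -i "/ $2\$/d" "$1"
}

# Local demonstration on a throwaway file:
printf '%s\n' 'ssh-ed25519 AAAAC3Nz... laptop' \
              'ssh-ed25519 AAAAC3Nz... phone' > demo_authorized_keys
revoke_key demo_authorized_keys laptop
cat demo_authorized_keys   # only the phone key remains

# Against real servers you would run the same sed over SSH, e.g.:
#   for h in vm1 vm2; do ssh "$h" "sed -i '/ laptop\$/d' ~/.ssh/authorized_keys"; done
```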

[–] buhala@sopuli.xyz 1 points 1 year ago (2 children)

I suppose I'm questioning why you'd need several SSH keys to begin with. The only places you should keep your private keys are highly secure ones (mine are in my password manager and on my PC, encrypted). Would there ever be a case where only one of them gets breached? Surely if someone gains access to either of those places, all your keys are gone anyway.

Asking because managing one key is easier than managing tons for me.

[–] dogmuffins@lemmy.ml 1 points 1 year ago

This. You can either have key-per-device or key-per-person. I don't really share access with anyone so key-per-person (just 1 for me) is the way.

If I did have multiple keys, they would be kept in the same place, so anyone getting access to one would get access to all of them anyway.

[–] mbell@lemmy.remotelab.uk 1 points 1 year ago

I guess it's just the way I've been using it for years and years. I've been working remotely for about a decade, so I've been in places where someone could have grabbed my keys if I'd left a laptop unlocked; not likely, though.

[–] mark@social.cool110.xyz 1 points 1 year ago (1 children)

@mbell You need a single source of truth to store the public keys in, then set AuthorizedKeysCommand in the SSHD config to a script that looks them up from there. I use this one for LDAP.
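The AuthorizedKeysCommand approach looks roughly like this. The lookup URL and script path are hypothetical stand-ins; the actual OpenSSH directives are AuthorizedKeysCommand and AuthorizedKeysCommandUser.

```shell
# sshd_config entries (hypothetical script path):
#   AuthorizedKeysCommand /usr/local/bin/fetch-keys %u
#   AuthorizedKeysCommandUser nobody
#
# The script prints the valid public keys for the user ($1 = %u) on stdout,
# fetching them from a single source of truth (hypothetical URL here):
cat > fetch-keys <<'EOF'
#!/bin/sh
curl -sf "https://keys.example.internal/${1}.keys"
EOF
chmod +x fetch-keys
```

sshd consults both this command and any on-disk authorized_keys file, so a failing lookup doesn't have to lock you out.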

[–] mbell@lemmy.remotelab.uk 1 points 1 year ago

This is an interesting solution. I don't have LDAP running at the moment, but it's on my list of things to add to my setup.

[–] UsefulIdiot@sh.itjust.works 1 points 1 year ago (1 children)

Couldn’t Keybase help with this?

[–] mbell@lemmy.remotelab.uk 1 points 1 year ago

https://keybase.io/, this Keybase? It seems to be a chat and file-sharing thing now; IIRC, didn't it used to be an identity-verification thing years ago?

[–] Stetsed@lemmy.one 1 points 1 year ago

So there are solutions like Teleport which handle SSH authentication, but they require external tools. My advice would be to add a passphrase to your SSH key and then just use the same key everywhere. It's less secure, true, but in my opinion as long as you have good security practices you'll be fine.

(Another solution would be an Ansible playbook that generates a new key every X days, distributes it to the servers using access granted by the old key, and then removes the old key from authorized_keys.)
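The rotation idea in that parenthetical can be sketched in shell (hypothetical host names and key comment; an actual Ansible playbook would template the same steps):

```shell
# 1. Generate a fresh key (no passphrase here, for brevity):
ssh-keygen -q -t ed25519 -f rotated_key -N '' -C "rotated-$(date +%F)"

# 2. For each server, install the new key using access granted by the old
#    one, then strip the old key out of authorized_keys (hypothetical hosts):
#   for h in vm1 vm2; do
#     ssh-copy-id -i rotated_key.pub "$h"
#     ssh -i rotated_key "$h" \
#       'grep -v "old-key-comment" ~/.ssh/authorized_keys > t && mv t ~/.ssh/authorized_keys'
#   done
```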

[–] burnus@lemmyrs.org 0 points 1 year ago (1 children)

Have you considered creating certificates for your keys? The basic idea is that you have a master CA key pair which you use to sign all your actual keys (ideally both user and host keys). Then you only add a CA entry to the authorized_keys and known_hosts on every machine, and whenever you add a user or host you only need to sign its keys; everything else trusts it automatically. Add a key revocation list (KRL) to solve the lost-keys problem and you should have everything you wanted. All of this comes out of the box with OpenSSH, no extra tools required.
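This setup really does use only stock OpenSSH tooling. A minimal sketch (key file names and the principal "alice" are hypothetical):

```shell
# 1. Create the CA key pair (keep ca_key somewhere very safe):
ssh-keygen -q -t ed25519 -f ca_key -N '' -C 'my-ssh-ca'

# 2. Create and sign a user key; signing produces user_key-cert.pub:
ssh-keygen -q -t ed25519 -f user_key -N '' -C 'alice-laptop'
ssh-keygen -q -s ca_key -I alice-laptop -n alice -V +52w user_key.pub

# 3. On every server, trust the CA instead of individual keys:
#      TrustedUserCAKeys /etc/ssh/ca_key.pub        (in sshd_config)

# 4. Lost a key? Add it to a key revocation list (KRL):
ssh-keygen -q -k -f revoked_keys user_key.pub
#      RevokedKeys /etc/ssh/revoked_keys            (in sshd_config)
```

The -V +52w flag limits the certificate's validity to a year, so even an unrevoked key eventually stops working on its own.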

[–] mbell@lemmy.remotelab.uk 1 points 1 year ago

This sounds interesting; I'll read up on it.

[–] themoonisacheese@sh.itjust.works 0 points 1 year ago (1 children)

Tailscale is a VPN built on top of WireGuard that identifies you at the network level. This means that as long as you connect through your tailnet, you can configure your machines to let you in without an auth challenge (or with only a 2FA challenge) via PAM modules. AFAIK the auto-login part is in beta, but you could also run a second sshd that only listens on the VPN interface and lets you in without a password. Tailscale lets you set network-level access permissions per user (though having more than three(?) users costs money) and supports logging in with YubiKey-like keys or OAuth.

Or you can set up a git repo that hosts your public key, plus cron jobs that pull it every hour/day/whatever. It's safe to publish your public key in this manner, and if you somehow lose the private key you simply update the repo and wait for the change to propagate.
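The git-plus-cron idea can be as small as this (repo URL and paths are hypothetical):

```shell
# crontab entry (hourly):  0 * * * * /usr/local/bin/pull-keys
cat > pull-keys <<'EOF'
#!/bin/sh
set -e
REPO=https://github.com/example/ssh-keys.git   # hypothetical repo of *.pub files
DIR=$HOME/.ssh-keys-repo
if [ -d "$DIR" ]; then git -C "$DIR" pull -q; else git clone -q "$REPO" "$DIR"; fi
cat "$DIR"/*.pub > "$HOME/.ssh/authorized_keys"
EOF
chmod +x pull-keys
```

GitHub also serves every user's public keys as plain text at https://github.com/USERNAME.keys, so a curl of that URL could replace the clone entirely.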

Or there's Ansible, which is particularly well suited to public key management, but it isn't really automated (you run it manually and it connects to every server), so you may end up in a situation where you'd like the server to pull valid keys by itself and can't log in manually until it has done so. If that's acceptable to you, Ansible also lets you manage a lot of your infra (automatic installs and enabling of common daemons such as monitoring come to mind).

[–] mbell@lemmy.remotelab.uk 1 points 1 year ago

Tailscale sounds like an interesting solution; I already have a WireGuard VPN running on all my personal devices.

At the moment GitHub is my single source of truth for public keys, so having that set up might work. I could also automate the cron job via Ansible when I set up the default config on new VMs.