Centralized SSH login to Google Compute Engine instances
This article is part of my “short notes for myself, maybe useful for others too” series.
Requirements
Enable global OS login feature
You can do it either globally on the project, then it’s valid for every instance:
# gcloud CLI
gcloud compute project-info add-metadata --metadata enable-oslogin=TRUE
# Terraform equivalent
resource "google_compute_project_metadata_item" "oslogin" {
  project = "<project ID>"
  key     = "enable-oslogin"
  value   = "TRUE"
}
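To check that the flag actually took effect (a quick sketch, assuming the gcloud CLI is authenticated against the project):

```shell
# Show project-wide metadata; enable-oslogin should appear with value TRUE
gcloud compute project-info describe \
  --format="value(commonInstanceMetadata.items)"
```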
You can also disable it one by one for specific VMs while keeping it on by default for everything else; instance metadata overrides the project-level setting.
Or just enable it machine by machine:
gcloud compute instances add-metadata [INSTANCE_NAME] --metadata enable-oslogin=TRUE
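The per-VM override works the same way with the value flipped: setting `FALSE` on an instance wins over a project-level `TRUE` (the instance name below is a placeholder):

```shell
# Opt a single VM out of OS Login even when it is enabled project-wide
gcloud compute instances add-metadata [INSTANCE_NAME] \
  --metadata enable-oslogin=FALSE
```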
Upload your SSH key to the project
gcloud compute os-login ssh-keys add --key-file ~/.ssh/id_rsa.pub
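Keys can also be uploaded with an expiry, and you can list what is already registered on your profile (the `--ttl` flag is optional):

```shell
# Upload a key that expires after 30 days
gcloud compute os-login ssh-keys add \
  --key-file ~/.ssh/id_rsa.pub --ttl 30d

# List the keys currently attached to your OS Login profile
gcloud compute os-login ssh-keys list
```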
Grant permissions to certain users or groups
There are two roles available:
- roles/compute.osLogin for non-root access
- roles/compute.osAdminLogin for users who get sudo
# gcloud CLI
gcloud projects add-iam-policy-binding $PROJECT \
  --member="user:$ACCOUNT_EMAIL" \
  --role=roles/compute.osAdminLogin
# Terraform equivalent
resource "google_project_iam_member" "role-binding" {
  project = "<project ID>"
  role    = "roles/compute.osAdminLogin"
  member  = "user:<ACCOUNT_EMAIL>"
}
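The same binding works for a whole Google group instead of a single user; note the `group:` member prefix (the group address below is a placeholder):

```shell
# Grant non-root OS Login to everyone in a group
gcloud projects add-iam-policy-binding $PROJECT \
  --member="group:devs@example.com" \
  --role=roles/compute.osLogin
```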
Usage
Your user name will always be derived from your full e-mail address, with every dot, special character, and the @ sign replaced by an underscore.
Example: daniel@megye.si becomes daniel_megye_si
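The transformation can be reproduced with a one-line shell command; `tr` here maps both `.` and `@` to `_`:

```shell
# Derive the OS Login user name from an e-mail address
echo "daniel@megye.si" | tr '.@' '_'
# -> daniel_megye_si
```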
With direct connection to the internal network
If you have
- Cloud VPN
- Cloud Interconnect
- or a jump server
to reach the internal network directly:
ssh daniel_megye_si@<VM IP>
ssh daniel_megye_si@<VM FQDN>

# gcloud CLI
gcloud compute ssh --internal-ip <INSTANCE_NAME>
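If you go through a jump server, a ProxyJump entry in `~/.ssh/config` saves typing; all host names below are hypothetical:

```
# ~/.ssh/config
Host *.gcp.internal
    User daniel_megye_si
    ProxyJump jumphost.example.com
```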
Every other case
You’re going to need a public IP on each machine, plus a firewall rule allowing SSH from the Internet.
gcloud compute ssh <INSTANCE_NAME>