Cheat Sheets: gcloud, bq, gsutil, kubectl for the Google Cloud Associate Certification
The gcloud command-line tool cheat sheet
A roster of go-to commands for the gcloud tool, Google Cloud's primary command-line tool.
Also included: an introductory primer, understanding commands, and a printable PDF.
Cheat sheet
Getting started
Get going with the gcloud command-line tool.
- gcloud init: Initialize, authorize, and configure the gcloud tool.
- gcloud version: Display version and installed components.
- gcloud components install: Install specific components.
- gcloud components update: Update your Cloud SDK to the latest version.
- gcloud config set project: Set a default Google Cloud project to work on.
- gcloud info: Display current gcloud tool environment details.
Help
Cloud SDK is happy to help.
- gcloud help: Search the gcloud tool reference documents for specific terms.
- gcloud feedback: Provide feedback to the Cloud SDK team.
- gcloud topic: Supplementary help material for non-command topics like accessibility, filtering, and formatting.
Personalization
Make the Cloud SDK your own; personalize your configuration with properties.
- gcloud config set: Define a property (like compute/zone) for the current configuration.
- gcloud config get-value: Fetch the value of a Cloud SDK property.
- gcloud config list: Display all the properties for the current configuration.
- gcloud config configurations create: Create a new named configuration.
- gcloud config configurations list: Display a list of all available configurations.
- gcloud config configurations activate: Switch to an existing named configuration.
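Named configurations make it easy to switch between work contexts. A minimal sketch (the configuration name, project ID, and zone are assumed placeholders):

```shell
# Create and activate a named configuration, then set its properties.
gcloud config configurations create dev-profile
gcloud config configurations activate dev-profile
gcloud config set project my-dev-project    # assumed project ID
gcloud config set compute/zone us-central1-a
gcloud config list                          # confirm the active properties
```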
Credentials
Grant and revoke authorization to the Cloud SDK.
- gcloud auth login: Authorize Google Cloud access for the gcloud tool with Google user credentials and set the current account as active.
- gcloud auth activate-service-account: Like gcloud auth login, but with service account credentials.
- gcloud auth list: List all credentialed accounts.
- gcloud auth print-access-token: Display the current account's access token.
- gcloud auth revoke: Remove access credentials for an account.
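For non-interactive use (scripts, CI pipelines), service account credentials are typical. A minimal sketch, assuming a key file key.json and a hypothetical service account address:

```shell
# Activate a service account, confirm it is the active credential,
# then revoke it when finished.
gcloud auth activate-service-account \
    deployer@my-project.iam.gserviceaccount.com \
    --key-file=key.json
gcloud auth list
gcloud auth revoke deployer@my-project.iam.gserviceaccount.com
```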
Projects
Manage project access policies.
- gcloud projects describe: Display metadata for a project (including its ID).
- gcloud projects add-iam-policy-binding: Add an IAM policy binding to a specified project.
Identity & Access Management
Configure Cloud Identity & Access Management (IAM) preferences and service accounts.
- gcloud iam list-grantable-roles: List grantable IAM roles for a resource.
- gcloud iam roles create: Create a custom role for a project or org.
- gcloud iam service-accounts create: Create a service account for a project.
- gcloud iam service-accounts add-iam-policy-binding: Add an IAM policy binding to a service account.
- gcloud iam service-accounts set-iam-policy: Replace the existing IAM policy on a service account.
- gcloud iam service-accounts keys list: List a service account's keys.
Docker & Google Kubernetes Engine (GKE)
Manage containerized applications on Kubernetes.
- gcloud auth configure-docker: Register the gcloud tool as a Docker credential helper.
- gcloud container clusters create: Create a cluster to run GKE containers.
- gcloud container clusters list: List clusters for running GKE containers.
- gcloud container clusters get-credentials: Update kubeconfig to get kubectl to use a GKE cluster.
- gcloud container images list-tags: List tag and digest metadata for a container image.
Virtual Machines & Compute Engine
Create, run, and manage VMs on Google infrastructure.
- gcloud compute zones list: List Compute Engine zones.
- gcloud compute instances describe: Display a VM instance's details.
- gcloud compute instances list: List all VM instances in a project.
- gcloud compute disks snapshot: Create a snapshot of persistent disks.
- gcloud compute snapshots describe: Display a snapshot's details.
- gcloud compute snapshots delete: Delete a snapshot.
- gcloud compute ssh: Connect to a VM instance by using SSH.
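The snapshot commands above chain together naturally. A minimal sketch, with assumed disk, zone, and snapshot names:

```shell
# Snapshot a persistent disk, inspect the result, then clean it up.
gcloud compute disks snapshot my-disk --zone=us-central1-a \
    --snapshot-names=my-disk-snap
gcloud compute snapshots describe my-disk-snap
gcloud compute snapshots delete my-disk-snap --quiet
```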
Serverless & App Engine
Build highly scalable applications on a fully managed serverless platform.
- gcloud app deploy: Deploy your app's code and configuration to the App Engine server.
- gcloud app versions list: List all versions of all services deployed to the App Engine server.
- gcloud app browse: Open the current app in a web browser.
- gcloud app create: Create an App Engine app within your current project.
- gcloud app logs read: Display the latest App Engine app logs.
Miscellaneous
Commands that might come in handy.
- gcloud kms decrypt: Decrypt ciphertext (to a plaintext file) using a Cloud Key Management Service (Cloud KMS) key.
- gcloud logging logs list: List your project's logs.
- gcloud sql backups describe: Display info about a Cloud SQL instance backup.
- gcloud sql export sql: Export data from a Cloud SQL instance to a SQL file.
Introductory primer
A quick primer for getting started with the gcloud command-line tool.
Installing the Cloud SDK
Install the Cloud SDK with these installation instructions.
Flags, arguments, and other wondrous additions
Arguments can be positional args or flags:
- Positional args: Set after the command name; must respect the order of positional args.
- Flags: Set after positional args; the order of flags doesn't matter.
- A flag can be either a:
  - Name-value pair (--foo=bar), or
  - Boolean (--force/--no-force).
- Additionally, flags can either be:
  - Required, or
  - Optional: in which case the default value is used if the flag is not set.
Global flags
Some flags are available throughout the gcloud command-line tool experience, like:
- --help: For when in doubt; display detailed help for a command.
- --project: If using a project other than the current one.
- --quiet: Disable interactive prompting (and apply default values for inputs).
- --verbosity: Set the verbosity level to debug, info, warning, error, critical, or none.
- --version: Display gcloud version information.
- --format: Set the output format to config, csv, default, diff, disable, flattened, get, json, list, multi, none, object, table, text, value, or yaml.
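Global flags compose with any command. A minimal sketch (my-project is an assumed project ID):

```shell
# List instances from a non-default project, as JSON, without
# interactive prompts.
gcloud compute instances list \
    --project=my-project \
    --format=json \
    --quiet
```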
Cleaning up results
Get the most from your output with the --filter, --format, --limit, and --sort-by flags.
For Compute Engine instances in zones prefixed with us and not of machine type f1-micro:
gcloud compute instances list --filter="zone ~ ^us AND -machineType:f1-micro"
For a list of projects created on or after 15 January 2018, sorted from oldest to newest, presented as a table with project number, project ID, and creation time columns, with dates and times in the local timezone:
gcloud projects list --format="table(projectNumber,projectId,createTime.date(tz=LOCAL))" \
    --filter="createTime>=2018-01-15T12:00:00" --sort-by=createTime
For a list of ten Compute Engine instances with a label my-label (of any value):
gcloud compute instances list --filter="labels.my-label:*" --limit=10
Understanding commands
The underlying patterns for gcloud commands, to aid self-discovery of commands.
Finding gcloud commands
The gcloud command-line tool is a tree; non-leaf nodes are command groups and leaf nodes are commands. (Also, tab completion works for commands and resources!)
Most gcloud commands follow this format:
gcloud + release level (optional) + component + entity + operation + positional args + flags
For example: gcloud + compute + instances + create + example-instance-1 + --zone=us-central1-a
Release level
Release level refers to the command's release status. Example: alpha for alpha commands, beta for beta commands; no release level is needed for GA commands.
Component
Component refers to the different Google Cloud services. Example: compute for Compute Engine, app for App Engine, etc.
Entity
Entity refers to the plural form of an element or collection of elements under a component. Example: disks, firewalls, images, instances, regions, and zones for compute.
Operation
Operation refers to the imperative verb form of the operation to be performed on the entity. Example: common operations are describe, list, create/update, delete/clear, import, export, copy, remove, add, reset, restart, restore, run, and deploy.
Positional args
Positional args refer to the required, order-specific arguments needed to execute the command. Example: <INSTANCE_NAMES> is the required positional argument for gcloud compute instances create.
Flags
Flags refer to the additional arguments, --flag-name(=value), passed in to the command after positional args. Example: --machine-type=<MACHINE_TYPE> and --preemptible are optional flags for gcloud compute instances create.
Run SSH with service accounts:
https://cloud.google.com/compute/docs/tutorials/service-account-ssh
gcloud iam service-accounts create ssh-account --project $PROJECT_ID \
    --display-name "ssh-account"
gcloud compute networks create ssh-example --project $PROJECT_ID
gcloud compute firewall-rules create ssh-all --project $PROJECT_ID \
    --network ssh-example --allow tcp:22
gcloud compute instances create target --project $PROJECT_ID \
    --zone us-central1-f --network ssh-example \
    --no-service-account --no-scopes \
    --machine-type f1-micro --metadata=enable-oslogin=TRUE
gcloud compute instances add-iam-policy-binding target \
    --project $PROJECT_ID --zone us-central1-f \
    --member serviceAccount:ssh-account@$PROJECT_ID.iam.gserviceaccount.com \
    --role roles/compute.osAdminLogin
gcloud compute instances create source \
    --project $PROJECT_ID --zone us-central1-f \
    --service-account ssh-account@$PROJECT_ID.iam.gserviceaccount.com \
    --scopes https://www.googleapis.com/auth/cloud-platform \
    --network ssh-example --machine-type f1-micro
gcloud compute ssh source --project $PROJECT_ID --zone us-central1-f
sudo apt update && sudo apt install python-pip -y && \
    pip install --upgrade google-api-python-client
curl -O https://raw.githubusercontent.com/GoogleCloudPlatform/python-docs-samples/master/compute/oslogin/service_account_ssh.py
python service_account_ssh.py \
    --cmd 'sudo apt install cowsay -y && cowsay "It works!"' \
    --project [PROJECT_ID] --zone us-central1-f --instance target
⋮
___________
It works!
-----------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
bq
https://cloud.google.com/bigquery/docs/reference/bq-cli-reference
Creating a dataset
In the Cloud Shell, use the bq mk command to create a dataset called bq_load_codelab.
bq mk bq_load_codelab
Viewing dataset properties
Verify that you created the dataset by viewing the dataset's properties with the bq show command.
bq show bq_load_codelab
You should see output similar to:
Dataset my-project:bq_load_codelab

   Last modified           ACLs            Labels
 ----------------- ---------------------- --------
  15 Jun 14:12:49   Owners:
                      projectOwners,
                      your-email@example.com
                    Writers:
                      projectWriters
                    Readers:
                      projectReaders
Use the bq load command to load your CSV into a BigQuery table.
bq load \
--source_format=CSV \
--skip_leading_rows=1 \
bq_load_codelab.customer_transactions \
./customer_transactions.csv \
id:string,zip:string,ttime:timestamp,amount:numeric,fdbk:float,sku:string
You used the following options:
- --source_format=CSV: Use the CSV data format when parsing the data file.
- --skip_leading_rows=1: Skip the first line in the CSV file, because it is a header row.
- bq_load_codelab.customer_transactions: The first positional argument defines which table the data should be loaded into.
- ./customer_transactions.csv: The second positional argument defines which file to load. In addition to local files, the bq load command can load files from Cloud Storage with gs://my_bucket/path/to/file URIs.
- schema: A schema can be defined in a JSON schema file or as a comma-separated list. This codelab uses a comma-separated list for simplicity.
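The same schema could instead be written as a JSON schema file (a sketch; the file name customer_schema.json is an assumption) and passed as the last argument to bq load in place of the inline comma-separated list:

```json
[
  {"name": "id",     "type": "STRING"},
  {"name": "zip",    "type": "STRING"},
  {"name": "ttime",  "type": "TIMESTAMP"},
  {"name": "amount", "type": "NUMERIC"},
  {"name": "fdbk",   "type": "FLOAT"},
  {"name": "sku",    "type": "STRING"}
]
```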
The customer_transactions table uses the following schema:
- id:string: A customer identifier.
- zip:string: A United States postal (ZIP) code.
- ttime:timestamp: The date and time that the transaction took place.
- amount:numeric: The amount of the transaction. A numeric column stores data in decimal form, useful for monetary values.
- fdbk:float: The rating from a feedback survey about the transaction.
- sku:string: An identifier for the item purchased.
Get the table details
Verify that the table was loaded by showing the table's properties.
bq show bq_load_codelab.customer_transactions
Output:
Table my-project:bq_load_codelab.customer_transactions

   Last modified          Schema          Total Rows   Total Bytes
 ----------------- -------------------- ------------ -------------
  15 Jun 15:13:55   |- id: string        3            159
                    |- zip: string
                    |- ttime: timestamp
                    |- amount: numeric
                    |- fdbk: float
                    |- sku: string
Now that your data is loaded, you can query it by using the BigQuery web UI, the bq command, or the API. Your queries can join your data against any dataset (or datasets, so long as they are all in the same location) that you have permission to read.
Run a standard SQL query that joins your dataset with the zipcode public dataset and sums up transactions by U.S. state. Use the bq query command to execute the query.
bq query --nouse_legacy_sql '
SELECT SUM(c.amount) AS amount_total, z.state_code AS state_code
FROM `bq_load_codelab.customer_transactions` c
JOIN `bigquery-public-data.utility_us.zipcode_area` z
ON c.zip = z.zipcode
GROUP BY state_code
'
This command should output something like:
Waiting on bqjob_r26...05a15b38_1 ... (1s) Current status: DONE
+--------------+------------+
| amount_total | state_code |
+--------------+------------+
| 53.6 | NY |
| 7.18 | TX |
+--------------+------------+
The query you just ran used both a public dataset and your own private dataset. Learn more by reading this commented version of the same query:
#standardSQL
SELECT
  /* Total of all transactions in the state. */
  SUM(c.amount) AS amount_total,
  /* State corresponding to the transaction's zipcode. */
  z.state_code AS state_code
/* Query the table you just constructed.
 * Note: If you omit the project from the table ID,
 * the dataset is read from your project. */
FROM `bq_load_codelab.customer_transactions` c
/* Join the table to the zipcode public dataset. */
JOIN `bigquery-public-data.utility_us.zipcode_area` z
/* Find the state corresponding to the transaction's zipcode. */
ON c.zip = z.zipcode
/* Group over all transactions by state. */
GROUP BY state_code
Optionally, delete the dataset you created with the bq rm command. Use the -r flag to remove any tables it contains.
bq rm -r bq_load_codelab
gsutil
gsutil mb -b on -l us-east1 gs://my-awesome-bucket/
gsutil cp Desktop/kitten.png gs://my-awesome-bucket
gsutil cp gs://my-awesome-bucket/kitten.png Desktop/kitten2.png
gsutil cp gs://my-awesome-bucket/kitten.png gs://my-awesome-bucket/just-a-folder/kitten3.png
gsutil ls gs://my-awesome-bucket
Use the gsutil ls command with the -l flag to get some details about one of your images:
gsutil ls -l gs://my-awesome-bucket/kitten.png
Use the gsutil iam ch command to grant all users permission to read the images stored in your bucket:
gsutil iam ch allUsers:objectViewer gs://my-awesome-bucket
To remove this access, use the command:
gsutil iam ch -d allUsers:objectViewer gs://my-awesome-bucket
Grant a specific user permission to create and view objects:
gsutil iam ch user:jane@gmail.com:objectCreator,objectViewer gs://my-awesome-bucket
Remove that user's access:
gsutil iam ch -d user:jane@gmail.com:objectCreator,objectViewer gs://my-awesome-bucket
Delete an object:
gsutil rm gs://my-awesome-bucket/kitten.png
Delete the bucket and everything in it:
gsutil rm -r gs://my-awesome-bucket
Generate a new private key, or use an existing private key for a service account. The key can be in either JSON or PKCS12 format. For more information on private keys and service accounts, see Service Accounts.
Use the gsutil signurl command, passing in the path to the private key from the previous step and the name of the bucket or object you want to generate a signed URL for. For example, using a key stored in the folder Desktop, the following command generates a signed URL for users to view the object cat.jpeg for 10 minutes:
gsutil signurl -d 10m Desktop/private-key.json gs://example-bucket/cat.jpeg
The signed URL is a string beginning with https://storage.googleapis.com and will likely extend for several lines. This URL can be used by any person to access the associated resource (in this case, cat.jpeg) for the designated time frame (in this case, 10 minutes).
gsutil lifecycle set [LIFECYCLE_CONFIG_FILE] gs://[BUCKET_NAME]
An example lifecycle configuration file:
{
"lifecycle": {
"rule": [
{
"action": {"type": "Delete"},
"condition": {
"age": 30,
"isLive": true
}
},
{
"action": {"type": "Delete"},
"condition": {
"age": 10,
"isLive": false
}
}
]
}
}
gsutil versioning set (on|off) gs://<bucket_name>...
gsutil versioning get gs://<bucket_name>...
kubectl
Apply a manifest
kubectl apply -f my-deployment.yaml
List pods
kubectl get pods
Show a Service's full configuration
kubectl get service my-cip-service --output yaml
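A minimal sketch tying these together (the deployment name and ports are assumptions): expose a deployment as a ClusterIP Service, then inspect it.

```shell
# Create a ClusterIP Service in front of an existing deployment,
# then dump the Service's full configuration as YAML.
kubectl expose deployment my-deployment --name my-cip-service \
    --type ClusterIP --port 80 --target-port 8080
kubectl get service my-cip-service --output yaml
```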
GKE
gcloud container clusters create cluster-name --num-nodes 30 \
--enable-autoscaling --min-nodes 15 --max-nodes 50 [--zone compute-zone]
In this command:
- --num-nodes specifies the number of nodes to create in each of the cluster's zones. The default is 3.
- --enable-autoscaling indicates that autoscaling is enabled.
- --min-nodes specifies the minimum number of nodes for the default node pool.
- --max-nodes specifies the maximum number of nodes for the default node pool.
- --zone specifies the compute zone in which the autoscaler should create new nodes.
gcloud container node-pools create pool-name --cluster cluster-name \
--enable-autoscaling --min-nodes 1 --max-nodes 5 [--zone compute-zone]
Enabling autoscaling for an existing node pool
gcloud container clusters update cluster-name --enable-autoscaling \
--min-nodes 1 --max-nodes 10 --zone compute-zone --node-pool default-pool
Set kubectl context
gcloud container clusters get-credentials <cluster-name>
List all container clusters
gcloud container clusters list
Resize a cluster
gcloud container clusters resize
The following update command enables Cloud Operations for GKE; only the options needed for Google Cloud's operations suite are shown:
gcloud beta container clusters update [CLUSTER_NAME] \
    --zone=[ZONE] --region=[REGION] \
    --enable-stackdriver-kubernetes
The field [REGION] is the compute region of the cluster.
Note: You must use Cloud SDK version 248.0.0 or higher.
Alternatively, the following update command enables legacy Logging and Monitoring; only the options needed for Google Cloud's operations suite are shown:
gcloud beta container clusters update [CLUSTER_NAME] \
    --zone=[ZONE] --region=[REGION] \
    --logging-service logging.googleapis.com \
    --monitoring-service monitoring.googleapis.com
Reading logs
gcloud logging read 'logName:projects/YOUR_PROJECT_ID/logs/stderr AND resource.type=k8s_container AND resource.labels.cluster_name=shop-cluster AND resource.labels.namespace_name=default AND textPayload:"Sorry, we cannot process jcb credit cards. Only VISA or MasterCard is accepted."' --limit 10 --format json
Network
List all networks
gcloud compute networks list
Detail of one network
gcloud compute networks describe <network-name> --format json
Create a network
gcloud compute networks create <network-name>
Create a subnet
gcloud compute networks subnets create subnet1 --network net1 --range 10.5.4.0/24
Reserve a static IP (note that --region takes a region such as us-west2, not a zone)
gcloud compute addresses create vpn-1-static-ip --region us-west2
List all IP addresses
gcloud compute addresses list
Describe an IP address
gcloud compute addresses describe <ip-name> --region us-central1
List all routes
gcloud compute routes list
DNS
List all record-sets in my_zone
gcloud dns record-sets list --zone my_zone
List the first 10 DNS records
gcloud dns record-sets list --zone my_zone --limit=10
Firewall
List all firewall rules
gcloud compute firewall-rules list
List all forwarding rules
gcloud compute forwarding-rules list
Describe one firewall rule
gcloud compute firewall-rules describe <rule-name>
Create one firewall rule
gcloud compute firewall-rules create my-rule --network default --allow tcp:9200,tcp:3306
Update one firewall rule
gcloud compute firewall-rules update my-rule --allow tcp:9200,tcp:9300
Images & Containers
List all images
gcloud compute images list
Cloud SQL
List all Cloud SQL instances
gcloud sql instances list
Services
List my backend services
gcloud compute backend-services list
List all my health check endpoints
gcloud compute http-health-checks list
List all URL maps
gcloud compute url-maps list