Jupyter + Keras + Nvidia Docker with GPU on Google Cloud in an instant

As a machine learning engineer, being able to set up environments for machine learning tasks by myself is really important. The more convenient public cloud services such as AWS and GCP become, the more important I think that ability is.

Today, I would like to describe a kit I made for setting up an environment with Jupyter Notebook, Keras, and nvidia-docker with a GPU on Google Cloud in an instant.


I wanted an environment for building deep learning models with Jupyter and a GPU, and there were a couple of candidates. docker-machine would have been the best fit, but unfortunately we can't create a GCP instance with GPUs through docker-machine yet; I'm waiting and watching the issue. Google Datalab would also be a great candidate, but I don't know whether it is fully compatible with Jupyter Notebook. In the end, neither of them satisfied my requirements.

GitHub Repository

As a result of my research into building a GCP instance with a GPU, I decided to make a kit by myself. You can of course use it in an instant; it covers the following steps:


  • Make a GPU instance on GCP with nvidia-docker
  • Run Jupyter as a docker container in the GPU instance
  • Set up an SSH tunnel to access Jupyter from a web browser
  • Install Python libraries in the container via pip (optional)
  • Upload your files to the instance from your local machine
  • Download files from the instance to your local machine
  • Delete the instance

Make a GPU instance on GCP with nvidia docker

make create-instance allows us to make a GPU instance on GCP with nvidia-docker, running a startup script. Before executing the command, I recommend editing the Makefile to set INSTANCE_NAME to your instance name and GCP_PROJECT_ID to your GCP project ID. You can also set them when creating the instance. It takes 5 minutes or so to create an instance.

make create-instance \
INSTANCE_NAME="test-gpu-instance"
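
Under the hood, the Makefile target presumably wraps a gcloud command along these lines. The zone, accelerator type, and startup-script filename below are placeholders I'm assuming for illustration, not the kit's actual values; check the Makefile in the repository for the real ones.

```shell
# Assumed sketch of the underlying command -- zone, GPU type, and
# startup-script name are placeholders, not the kit's actual values.
gcloud compute instances create test-gpu-instance \
  --project xxx-xxx-xxx \
  --zone us-west1-b \
  --accelerator type=nvidia-tesla-k80,count=1 \
  --maintenance-policy TERMINATE \
  --metadata-from-file startup-script=startup-script.sh
```

Note that GPU instances require --maintenance-policy TERMINATE, because instances with accelerators cannot be live-migrated.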

The YAML file below creates an anaconda environment for the docker container with jupyter, keras, and so on. You don't need to install them yourself, because the docker image already includes them.
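
For reference, such an environment file might look roughly like this. The environment name and package list here are illustrative assumptions, not the exact contents of the repository's file.

```yaml
# Illustrative sketch only -- see the repository for the real file.
name: deep-learning
channels:
  - defaults
dependencies:
  - python=3.6
  - jupyter
  - pip:
    - keras
    - tensorflow-gpu
```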

Run jupyter as a docker container in the GPU instance

After creating an instance, run Jupyter Notebook with make run-jupyter.

make run-jupyter \
INSTANCE_NAME="test-gpu-instance"

Set a SSH tunnel to access jupyter from a web browser

In order to access the Jupyter server you launched, you need an SSH tunnel. Once the tunnel is up, you can open http://localhost:18888 in a web browser on your local machine. If you don't set any value when running make ssh-tunnel, the port falls back to the default: JUPYTER_PORT is 18888.

make ssh-tunnel \
INSTANCE_NAME="test-gpu-instance" \
GCP_PROJECT_ID=xxx-xxx-xxx
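
Such a tunnel can also be opened by hand with gcloud, and the Makefile target presumably does something similar. This is an assumed sketch (the zone is a placeholder), not the kit's exact command:

```shell
# Forward local port 18888 to the Jupyter port on the instance.
# Assumed sketch -- the actual make ssh-tunnel target may differ.
gcloud compute ssh test-gpu-instance \
  --project xxx-xxx-xxx \
  --zone us-west1-b \
  -- -N -L 18888:localhost:18888
```

The arguments after -- are passed straight to ssh: -N keeps the tunnel open without running a remote command, and -L maps the local port to the remote one.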

Install Python libraries in the container via pip

If you would like to install additional Python libraries in the docker container where Jupyter is running, first add them to requirements.txt, then execute make pip-install.

make pip-install \
INSTANCE_NAME="test-gpu-instance"

Upload your files to the instance from your local machine

If you would like to upload files such as notebooks and data, make upload-files allows you to do that; set FROM to your path. The files will be uploaded under /src in the GCP instance. That directory is mounted at /src in the docker container.

make upload-files \
INSTANCE_NAME="test-gpu-instance" \
GCP_PROJECT_ID=xxx-xxx-xxx
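
If you prefer doing this by hand, gcloud ships an scp wrapper that copies in the same direction; a sketch, where ./notebooks is a hypothetical local directory:

```shell
# Assumed sketch -- ./notebooks is a hypothetical local path.
gcloud compute scp --recurse ./notebooks test-gpu-instance:/src \
  --project xxx-xxx-xxx
```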

Download files from the instance to your local machine

When you save models under /src/outputs in the docker container, you can download them with make download-outputs. Set TO to the local path where you want the files downloaded.

make download-outputs \
INSTANCE_NAME="test-gpu-instance" \
GCP_PROJECT_ID=xxx-xxx-xxx
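
The same gcloud scp wrapper works in the reverse direction, too; a sketch, where ~/outputs is a hypothetical local destination:

```shell
# Assumed sketch -- ~/outputs is a hypothetical local path.
gcloud compute scp --recurse test-gpu-instance:/src/outputs ~/outputs \
  --project xxx-xxx-xxx
```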

Delete the instance

Please don't forget to delete the instance when you finish making your models.

make delete-instance \
INSTANCE_NAME="test-gpu-instance"
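
If you ever misplace the kit, the instance can also be deleted directly with gcloud, which stops the GPU billing just the same:

```shell
# Equivalent manual command for removing the instance.
gcloud compute instances delete test-gpu-instance --project xxx-xxx-xxx
```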


I made a kit for creating a GCP instance with a GPU, Jupyter, Keras, and so on in an instant. I hope it helps you with your machine learning life.