Running Jupyter on a remote docker container via SSH

In my research I work with Machine Learning/Deep Learning algorithms, which I mostly develop using Python. As such, I find it useful to test different frameworks (such as Keras, PyTorch, TensorFlow…). This led me to Docker, and in particular to the ufoym/deepo image, as it provides a very complete environment.

I use a remote machine with GPUs, where I have Docker installed. When programming and testing code snippets, I find it really helpful to use Jupyter. I had previously used Jupyter locally… but the question here was how to use Jupyter running on another machine. And on top of that, what if it was running inside a Docker container on that remote machine?

In this tutorial I will share how I solved this puzzle and made it possible to use the Jupyter web app locally while the code runs on a remote machine.

My working environment

Let me first illustrate my working environment setup.

  1. I use my regular laptop as the host machine.
  2. I connect to the server of my department (remote) via ssh.
  3. I then connect to a specific machine within the server which has GPUs.
  4. Finally I start the docker container in that machine.
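Put together, the chain above looks roughly like the following. This is just a sketch: remote.example.org and gpu-machine are placeholder names standing in for your own hosts.

```shell
# 1–2. From the laptop (host) to the department server (remote):
$ ssh user@remote.example.org

# 3. From the server to the GPU machine:
$ ssh user@gpu-machine

# 4. On the GPU machine, start the container (details below):
$ nvidia-docker run -it ufoym/deepo bash
```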

In my particular case, the machine I work on is the one labeled GPU in the setup above, which happens to be accessible only from another machine within the server. Hence the double ssh… But in your case you might be able to connect directly to your GPU machine from the outside world.

It sounds messy, but bear with me: you only need to organise it once and for all.



Make sure to have Docker installed on your remote machine. If you are unfamiliar with Docker, check some of the many resources available online. If you are aiming for a GPU-enabled environment, see the next paragraph.


If your remote machine has GPUs, you might want to exploit this fact. Make sure you have the NVIDIA driver installed, and also install nvidia-docker. Note that installing nvidia-docker automatically pulls in the latest stable release of docker-ce, so you don’t need to install Docker explicitly beforehand.
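A common smoke test to check that the GPU is visible from inside containers is to run nvidia-smi in a throwaway CUDA container. The image tag below is just an example; pick one matching your CUDA version.

```shell
# Should print the same GPU table as running nvidia-smi on the machine itself.
$ nvidia-docker run --rm nvidia/cuda:10.0-base nvidia-smi
```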

Setting up the connection

host — remote

Jupyter Notebook runs on a certain port on the machine. Hence, the basic idea would be to make that port reachable from your host machine. Luckily, ssh provides the -L option to specify port forwarding.

$ ssh -L <host port>:localhost:<remote port> user@remote

In my case I use port 9999 in both ends, namely <host port> = <remote port> = 9999.

remote — GPU

Here the approach is exactly the same as before.

$ ssh -L <remote port>:localhost:<GPU port> user@GPU

Again, I use <remote port> = <GPU port> = 9999.

GPU — docker

This step is slightly different. First, you need to create a docker container using some docker image of your preference. In my case, as aforementioned, I am using ufoym/deepo. Since I want GPU-support, I will use nvidia-docker to create the container.

$ nvidia-docker run -it \
-p <GPU port>:<container port> \
--name <container name> \
ufoym/deepo bash

Note the option -p, which tells Docker to publish the container’s port on the host. This way, we can reach apps running inside the container on that port from the outside world. Here, I also use <GPU port> = <container port> = 9999.

By the way, an extremely useful option when creating a container is -v, which lets you access files of the host machine from within the container.
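Putting -p and -v together, a container creation command might look like the sketch below. The path /home/user/projects, the mount point /workspace, and the name jupyter-gpu are placeholders for illustration.

```shell
# -p maps <GPU port>:<container port>; -v maps <host path>:<container path>.
$ nvidia-docker run -it \
    -p 9999:9999 \
    -v /home/user/projects:/workspace \
    --name jupyter-gpu \
    ufoym/deepo bash

# Afterwards, you can double-check the resulting port mapping with:
$ docker port jupyter-gpu
```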

Running Jupyter

Well, once all the tunneling is set up, we can start our Jupyter app. We need to pass an IP of 0.0.0.0, so that Jupyter listens on all interfaces of the container rather than just on localhost, together with the same port we used when creating the docker container instance. Also, in my case, I use the option --allow-root, as I am root in my container and Jupyter won’t run unless I use this option.

$ jupyter notebook --ip 0.0.0.0 --port <container port> --allow-root

Oh, and if you prefer the new cool jupyter lab, just use the following command instead.

$ jupyter lab --ip 0.0.0.0 --port <container port> --allow-root

Now, on my host, I simply go to localhost:9999 and voilà, there you go.

Side notes

Enable more ports

If you have a similar working environment, I would recommend enabling more ports for remote access. Simply add extra -L <local port>:localhost:<remote port> options to your ssh command. This way, if you ever run other services on some port of your docker machine, you can easily access them remotely.

Automate ssh login

As you add more options to your ssh command, it grows in size. An alternative is to define a host in your ~/.ssh/config file. In my case I added a host <name>:

Host <name>
    Hostname <remote IP>
    Port <remote port for SSH (typically 22)>
    IdentityFile <path to rsa key>
    LocalForward 9999 localhost:9999
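Since my setup involves two hops, one can go a step further and let ssh chain them automatically with the ProxyJump directive (available in OpenSSH 7.3+). The following is a sketch, with all names and addresses as placeholders:

```
# ~/.ssh/config — placeholders throughout
Host gateway
    Hostname <remote IP>
    User <user>

Host gpu
    Hostname <GPU machine address, as seen from the gateway>
    User <user>
    ProxyJump gateway
    LocalForward 9999 localhost:9999
```

With this, a single `ssh gpu` opens the whole tunnel through the gateway in one command.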

Any error?

A good way to test whether your tunneling works is Python’s http.server module. At every stage, you can check whether the ports were correctly forwarded. Use the following command to run a simple HTTP server on a particular port.

$ python -m http.server <remote port>

Note that this works for Python 3; on Python 2, use python -m SimpleHTTPServer <remote port> instead.
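The same check can be scripted from the host side. Below is a minimal sketch that asks localhost on a given port for an HTTP response; here it spins up a local http.server as a stand-in for the remote end, but pointed at your forwarded port (e.g. 9999) it tells you whether the tunnel is alive. The helper name check_port is made up for this example.

```python
import http.server
import socketserver
import threading
import urllib.request

def check_port(port):
    """Return the HTTP status served on localhost:<port>, or None if nothing answers."""
    try:
        with urllib.request.urlopen("http://localhost:%d" % port, timeout=2) as resp:
            return resp.status
    except OSError:
        return None

# Local stand-in for `python -m http.server` on the far end of the tunnel;
# binding to port 0 picks an ephemeral port so the demo never collides
# with a real service.
server = socketserver.TCPServer(("", 0), http.server.SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

print(check_port(port))  # 200 when the server (or your tunnel) is reachable
server.shutdown()
```

If the tunnel is down, check_port returns None instead of raising, so the script can probe each hop in turn.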

If you run into any other difficulties, please leave a comment and I’ll be glad to help!