Advanced level:

On a Remote Linux Server Using Jupyter Notebook and Docker for TensorFlow 2.0 Keras

caner kilinc
Nov 4 · 5 min read

Question: How to install and use TensorFlow 2.0 on a remote Linux server?
Answer: it is straightforward with the Docker tool and port forwarding!

Installing TensorFlow can indeed be problematic and time-consuming, especially if you would like to use the GPU version on a remote server. First, check whether a GPU is available on the Linux machine, i.e. the remote computation platform:

$ lspci | grep -i nvidia

Here are the results in my terminal window; the required Nvidia GPU driver is already installed on the host server, i.e. the remote Linux machine:
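If you do not have a GPU machine at hand, the snippet below shows, on a hypothetical sample line (not from a real run), what a matching lspci entry looks like and how the case-insensitive grep filter above picks it out:

```shell
# A hypothetical sample of lspci output for an Nvidia card (not real output)
sample="01:00.0 VGA compatible controller: NVIDIA Corporation GP102 [GeForce GTX 1080 Ti]"

# The same case-insensitive filter as above, applied to the sample line
echo "$sample" | grep -i nvidia
```

If the command prints nothing on your server, no Nvidia device is visible on the PCI bus and the GPU image will not help.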


Thanks to Docker, configuring a development environment is nowadays much simpler for developers and data scientists. In this article, as illustrated in the figure above, I describe the following:

  • how to run Jupyter Notebooks and use the TensorFlow 2.0 deep learning library in a Docker Linux environment on a remote Linux server, and
  • how to access the remote Docker TensorFlow 2.0 development/computation environment from your local machine via your favorite browser, e.g. Chrome

Step 1- Run Jupyter Notebooks and TensorFlow 2.0 Application in Docker Linux-environment on a Remote Linux Server

First of all, as described in the previous article, we need to be able to access the remote Linux server via SSH while forwarding a port. The command to do that is below (replace user@remote-host with your own user name and server address):

$ ssh -L 8000:localhost:8000 user@remote-host
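If you reconnect often, the tunnel can be made reusable with an entry in ~/.ssh/config (the host alias, host name, and user below are placeholders, not from the article):

```
# ~/.ssh/config -- hypothetical entry; adjust HostName and User for your server
Host tf-remote
    HostName remote-host.example.com
    User user
    LocalForward 8000 localhost:8000
```

After this, `ssh tf-remote` opens the connection with the port forwarding already in place.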

Once we can access the remote Linux server via SSH with port forwarding active, we can install Docker (how to install Docker on a Linux machine is described in the previous article). After Docker is installed, we need a Linux image in which the necessary TensorFlow 2.0 deep learning libraries are installed. In this article, I use the official TensorFlow image with the TensorFlow 2.0 GPU development libraries.

With the command below you can download and run a GPU-enabled TensorFlow image (this may take a few minutes). If you get an error due to insufficient disk space, see the prune command further below.

$ sudo nvidia-docker run -it --rm tensorflow/tensorflow:latest-gpu python -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"

Run the command below only if you want to clean up unused Docker images to free disk space:

$ docker system prune --all --force  # only run this if you hit the disk-space issue

By now you should have the TensorFlow 2.0 GPU image installed. Next, to start a container from that image, we run the command below:

$ sudo nvidia-docker run -it -p 8000:8000 -v /home/user:/home tensorflow/tensorflow:latest-gpu bash

Success: TensorFlow 2.0 is now installed.


To run a different TensorFlow 2.0 image with Jupyter Notebook

# as an alternative, you can also use this image with Jupyter Notebook
$ sudo nvidia-docker run -it -p 8000:8000 -v /home/user:/home tensorflow/tensorflow:latest-gpu-jupyter bash

A brief explanation of the command line above, as follows:

  • without sudo rights, you cannot run Docker
  • nvidia-docker run -it initiates a Docker container with GPU support and an interactive terminal
  • 8000:8000 declares which container port is forwarded to which port of the remote server. It is convenient to use the same port number everywhere to avoid confusion, so we pick port 8000, the same port we used while establishing the SSH connection between the local machine and the remote Linux server. As with the SSH tunnel, we forward the container's port 8000 to the remote Linux server's port 8000, which in turn is forwarded to our local machine's port 8000 as illustrated in the figure above
  • -v /home/user:/home, where user is the user name used to log in to the remote Linux server as described in the previous article. This mounts the remote server's home directory into the container's /home directory, so you can, e.g., save and access files on the remote Linux server.
  • bash starts an interactive session in the container's terminal, so you can run commands as you would on a local machine
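To make the role of each flag explicit, the command can be assembled from its parts. This is only a sketch that prints the command rather than running it, using the same placeholder paths as above:

```shell
# Each part of the docker command, named for clarity (sketch only;
# it prints the command instead of executing it)
PORTS="-p 8000:8000"                       # container port -> remote server port
MOUNT="-v /home/user:/home"                # server home dir -> container /home
IMAGE="tensorflow/tensorflow:latest-gpu"   # official TF 2.0 GPU image

echo "sudo nvidia-docker run -it $PORTS $MOUNT $IMAGE bash"
```

Changing one variable, e.g. IMAGE to the latest-gpu-jupyter tag, gives you the alternative command from the previous section.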

Once we are inside the Docker container, we can start the Jupyter Notebook as below. Before running Jupyter, make sure you are in the right directory, e.g. the home directory or wherever your data files are stored:

$ jupyter notebook --ip 0.0.0.0 --port 8000 --allow-root

Please note that the command above starts the Jupyter Notebook and prints an authentication token for accessing it. The token will be used at the end of the next step.
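The token appears inside a URL that Jupyter prints on startup, of the form `http://127.0.0.1:8000/?token=...`. A small helper like the hypothetical one below can pull it out of that line (the sample line and the function name are my own illustration, not part of Jupyter's API):

```python
import re

# Hypothetical sample of the startup line Jupyter prints (not real output)
sample = "    http://127.0.0.1:8000/?token=abc123def4567890abc123def4567890"

def extract_token(line):
    """Return the value of the token query parameter, or None if absent."""
    match = re.search(r"[?&]token=([0-9a-f]+)", line)
    return match.group(1) if match else None

print(extract_token(sample))  # -> abc123def4567890abc123def4567890
```

In practice you can simply copy the token by hand from the terminal; the helper just shows where in the output it lives.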

Step 2- Access the Remote Docker Linux Development/Computation Environment via Your Favorite Browser, e.g. Chrome, on Your Local Machine

Now you are ready to start your browser, e.g. Chrome, on your local machine and type http://localhost:8000 into the address bar. When Jupyter asks for authentication, copy and paste the token printed in the terminal window at the end of Step 1.

I hope this quick description was useful for you!

More Tips & Hints For The Readers:

  • Do not save data/files inside the Docker container, which may occasionally crash; instead, save the files on the remote Linux server via the mounted /home directory.
  • Please note that if you are trying to run Docker and install a Docker image on a virtual machine, e.g. on OpenStack, you may face disk-space issues, in which case you need to attach an external disk. Without the external disk, you will not be able to install Docker and the image.
  • Also, note that TensorFlow is indeed greedy in terms of computation: by default, a running TensorFlow process claims all the available GPU resources. If you are sharing the machine with other users, while you are running your TensorFlow applications, the other users will not be able to use the GPU or run TensorFlow applications in parallel.
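One common mitigation, assuming the server has more than one GPU, is to pin each user's TensorFlow process to a specific device with the standard CUDA_VISIBLE_DEVICES environment variable before launching Jupyter (the device index 0 here is just an example):

```shell
# Restrict this shell's TensorFlow/CUDA processes to GPU 0 only,
# leaving the remaining devices free for other users
export CUDA_VISIBLE_DEVICES=0
echo "visible devices: $CUDA_VISIBLE_DEVICES"
```

In addition, TensorFlow 2.0 can be told to allocate GPU memory on demand via tf.config.experimental.set_memory_growth, instead of grabbing all of it upfront.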

in 5 mints: TensorFlow 2.0 in Practice

“Only when you have explicitly understood something can you briefly describe it, simplify it, and even teach it to your grandma.”

caner kilinc

Written by

Caner has ~10 years of industrial experience and currently serves as a Senior Data Scientist on AI projects.

