Running TensorFlow and Keras in a Jupyter Notebook via Docker

Alex Punnen
Published in Better ML · 1 min read · Feb 27, 2019

This assumes that you have an NVIDIA GPU in your machine and the NVIDIA drivers installed. Please read the below before you install the NVIDIA drivers.

The easiest way to use TensorFlow with a GPU is via the official TensorFlow Docker image, which runs on top of the NVIDIA Docker runtime.
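Before pulling the TensorFlow image, it is worth checking that Docker can actually see the GPU through the NVIDIA runtime. This is a minimal sanity check; it assumes nvidia-docker2 is set up and uses a generic CUDA base image (any CUDA tag will do) just to run nvidia-smi inside a container:

docker run --runtime=nvidia --rm nvidia/cuda:10.0-base nvidia-smi

If this prints the same GPU and driver details you see on the host, the runtime is wired up correctly.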

If you do not need Keras, you can use the official TensorFlow image as shown below.

Tip: If you are working on a remote GPU machine, use SSH port forwarding so that you can browse the Jupyter notebook from your local machine.

ssh -L 8888:127.0.0.1:8888 root@<remotemachine ip>
----------
docker run --entrypoint=/bin/bash --env http_proxy=<if you are behind one> --env https_proxy=<xyz> --runtime=nvidia -p 8888:8888 -it --rm -v /usr/alex/:/coding --net=host tensorflow/tensorflow:1.13.0rc1-gpu-jupyter
----------
cd /coding
jupyter notebook --allow-root
(note down the token printed in the console)
-----------

Since remote port 8888 is mapped to your local port, you can open the notebook from your own machine:

http://127.0.0.1:8888/?token=<pass in the token copied from remote console>

You should be able to see the Jupyter notebook at http://127.0.0.1:8888.
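To confirm that TensorFlow inside the container actually sees the GPU, you can run a quick check in a new notebook cell, or from another shell on the host as below (a minimal sketch; <container id> is whatever docker ps shows for the running TensorFlow container, and tf.test.is_gpu_available() is the TF 1.x API matching this image):

docker exec -it <container id> python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"

It should print True and log the GPU device that TensorFlow picked up.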

I have installed Keras and some other Python libraries on top of the official tensorflow_tfserving:latest-gpu Docker image and saved it as alexcpn/tfserving-dev-gpu.
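If you want Keras available out of the box, you can start that image the same way. The command below simply mirrors the earlier docker run options (adjust the volume path and proxy settings for your setup):

docker run --entrypoint=/bin/bash --runtime=nvidia -p 8888:8888 -it --rm -v /usr/alex/:/coding --net=host alexcpn/tfserving-dev-gpu

From there, the cd /coding and jupyter notebook --allow-root steps are the same as above.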

Happy coding!
