Getting Started With NVIDIA GPU, Anaconda, TensorFlow and Keras on Arch Linux

Intro

This is actually a pretty simple setup. First, we will install NVIDIA drivers and CUDA, then we will install Anaconda, TensorFlow binaries for GPUs, and Keras.

Install NVIDIA drivers

Let’s first install the drivers and CUDA:

yaourt -S nvidia nvidia-utils cuda

Now, you need to go to NVIDIA’s website and download the cuDNN tarball. Just place the file in your “Downloads” folder.

Next, we install the package with:

yaourt -S cudnn

This should install all NVIDIA dependencies. Let’s go now and install Anaconda.

yaourt -S anaconda
# add anaconda's binaries to your PATH and load it in your current session
echo 'export PATH="/opt/anaconda/bin:$PATH"' >> ~/.bash_profile
source ~/.bash_profile

At this point you can check your installation with anaconda --version && conda --version.

Let’s create an environment and install TensorFlow and Keras:

conda create -n deep-learning
source activate deep-learning
# at the time of writing this is the latest binary for GPUs
# go to https://www.tensorflow.org/get_started/os_setup
# and grab the equivalent "linux, gpu, python 3.5"
export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow_gpu-0.12.0-cp35-cp35m-linux_x86_64.whl
pip3 install --upgrade $TF_BINARY_URL
pip3 install ipython

Now, you should be able to enter an interactive Python session and load TensorFlow. Your output should look like the one below:

(deep-learning) [mimoralea@hash ~]$ ipython
Python 3.5.2 |Anaconda custom (64-bit)| (default, Jul 2 2016, 17:53:06)
Type "copyright", "credits" or "license" for more information.
IPython 5.1.0 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
In [1]: import tensorflow
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcublas.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcudnn.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcufft.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcuda.so.1 locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcurand.so locally
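
If you want to double-check that TensorFlow actually places operations on the GPU, here is a minimal sketch, assuming the 0.12-era API installed above; the small matmul and the log_device_placement flag are there just for illustration:

# tiny GPU placement check (illustrative only)
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]], name='a')
b = tf.constant([[1.0, 1.0], [0.0, 1.0]], name='b')
c = tf.matmul(a, b)

# log_device_placement prints which device each op lands on;
# you should see the matmul reported on /gpu:0
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    print(sess.run(c))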

Great, now install Keras, configure it and test it.

pip3 install keras
mkdir ~/.keras
vi ~/.keras/keras.json

Add the Keras config with these values:

{
    "image_dim_ordering": "tf",
    "epsilon": 1e-07,
    "backend": "tensorflow",
    "floatx": "float32"
}
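
If you are curious whether Keras picked up these settings, you can query its backend module directly. This is just an optional sanity check, using the Keras 1.x API current at the time of writing (later versions renamed some of these helpers):

# print the values Keras loaded from ~/.keras/keras.json
from keras import backend as K

print(K.backend())             # should print 'tensorflow'
print(K.floatx())              # should print 'float32'
print(K.epsilon())             # should print 1e-07
print(K.image_dim_ordering())  # should print 'tf'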

Finally, test the end to end installation:

(deep-learning) [mimoralea@hash ~]$ ipython
Python 3.5.2 |Anaconda custom (64-bit)| (default, Jul 2 2016, 17:53:06)
Type "copyright", "credits" or "license" for more information.
IPython 5.1.0 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
In [1]: import keras
Using TensorFlow backend.
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcublas.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcudnn.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcufft.so locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcuda.so.1 locally
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcurand.so locally

Did you see “Using TensorFlow backend.”? That was Keras right there. Well done!
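
If you want to go one step beyond the import, here is a minimal sketch that trains a tiny network on random data, just to exercise the stack end to end. It uses the Keras 1.x API current at the time of writing (nb_epoch became epochs in later versions), and the layer sizes and toy data are arbitrary:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# random toy data, just to push some work through the GPU
X = np.random.random((1000, 20))
y = np.random.randint(2, size=(1000, 1))

# a tiny binary classifier
model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])

# nb_epoch is the Keras 1.x keyword argument
model.fit(X, y, nb_epoch=2, batch_size=32)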

Let’s do a quick test:

First, get out of your environment, update some packages, and enable your environment again:

source deactivate
conda update conda
conda update anaconda
conda install nomkl numpy scipy scikit-learn numexpr
conda update mkl
source activate deep-learning

Now, let’s get that script running. You’ll like this, I promise.

# get into /tmp so our work gets thrown away afterwards
cd /tmp
mkdir results
# download the style transfer script
wget https://raw.githubusercontent.com/fchollet/keras/master/examples/neural_style_transfer.py
# the packages we installed a moment ago are needed to run this particular sample

Now, download an image that you like into the same folder. Call it myimage.jpg, for example. I added this one:

Also, download one of the most famous paintings of all time. Call it mystyle.jpg (or whatever you want, really). I actually used a picture of a painting my wife authored. Bonus points, you know… ;)

Now, the magic command:

python neural_style_transfer.py myimage.jpg mystyle.jpg results/myresult
ls -l results/

There you should see all the intermediate images the script produced. The 100th image looks OK, though I would probably leave it running a little longer to get a better result.

You can also create a gif with ImageMagick (remember to run it inside the results folder):

cd results
convert -delay 10 -loop 0 `ls -v *.png` animated.gif

For a point of comparison, here is the result Deepart.io returns:

I like theirs better, but ours is not that bad!! You just have to play with the settings for a little longer.

Here is a sample of how the same picture looks with another style:

Not bad, right? You can see the chunks of the oil painting transferred into this photo. This is pretty amazing.

Alright, enjoy!