Can Artificial Intelligence Dream of Cannabis?
Cannabis Data Science Group, by Juan Cruz Rodriguez
Remember DeepDream? Yes, the deep neural network (NN) program that transformed pictures by adding psychedelic “dream” effects. If not, take a look at Wikipedia’s article.
A Non-Technical DeepDream TL;DR
For a slightly more technical explanation, we recommend the Keras blog.
DeepDream uses a deep convolutional network named “Inception”, originally trained to classify images automatically. DeepDream’s author, Alexander Mordvintsev, wanted to know what calculations/transformations the model performs on an input image while trying to classify it.
Inception’s architecture contains ten independent concatenate layers (“Concat”; red boxes in the following figure). By assigning different weights to these layers and feeding the network an input image, we can visually inspect each layer’s transformation effect. To Alexander’s surprise, the results were astonishing: the images acquired artistic, psychedelic alterations.
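The core idea can be sketched as gradient ascent on the input image: nudge the pixels so a chosen layer’s activations grow stronger. This is a minimal sketch assuming TensorFlow/Keras and its InceptionV3 (the original DeepDream used the earlier GoogLeNet Inception); the layer name and step sizes are illustrative choices, not the exact ones DeepDream used.

```python
# Minimal DeepDream-style sketch (assumes TensorFlow/Keras; "mixed3" and the
# step sizes are illustrative, not DeepDream's exact choices).
import numpy as np
import tensorflow as tf

# weights=None keeps the sketch lightweight; in practice use weights="imagenet".
base = tf.keras.applications.InceptionV3(include_top=False, weights=None)
# In Keras's InceptionV3, the concatenate blocks are the "mixed*" layers.
dream_model = tf.keras.Model(base.input, base.get_layer("mixed3").output)

def gradient_ascent(img, steps=10, step_size=0.01):
    """Nudge the image so the chosen layer's activations get stronger."""
    img = tf.Variable(img)
    for _ in range(steps):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(dream_model(img))
        grads = tape.gradient(loss, img)
        grads /= tf.math.reduce_std(grads) + 1e-8  # normalize gradient scale
        img.assign_add(step_size * grads)
    return img.numpy()
```

Repeating this at several image scales (“octaves”), as the Keras blog describes, is what produces the full dream effect.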
Inception was trained on a large number of images, among which animal images predominated. That is why animal-related effects commonly appear in the dreams DeepDream generates. This is where our interest arose: what would happen if we retrained Inception’s architecture with images of cannabis? 🤯
Retraining the Architecture
For this project, Inception’s architecture was retrained to classify cannabis images into four sub-categories. A total of 2,822 images were used, tagged as “flower” 💐 (2,243 images), “plant” 🌱 (90), “pre-roll” 🚬 (257), and “seeds” 🌰 (232), as exemplified by the following images, respectively:
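The retraining step can be sketched as transfer learning: reuse Inception’s convolutional base and attach a new 4-class head. This is a minimal sketch assuming TensorFlow/Keras; `weights=None` keeps it lightweight here, but in practice you would start from the pretrained `"imagenet"` weights, freeze the base at first, and fit on the labeled cannabis images.

```python
# Transfer-learning sketch for the 4-class cannabis classifier (assumes
# TensorFlow/Keras; weights=None is only to keep the sketch lightweight --
# in practice start from weights="imagenet").
import tensorflow as tf

NUM_CLASSES = 4  # flower, plant, pre-roll, seeds

base = tf.keras.applications.InceptionV3(
    include_top=False, weights=None, pooling="avg")
base.trainable = False  # freeze the convolutional base at first

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

From here, `model.fit(...)` on the tagged images yields the retrained classifier.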
Analyzing Each Layer Individually
Once the model, called “WeedCeption_v1”, was trained, we followed a procedure similar to the one developed for DeepDream. In this first stage, to analyze the contribution of each concatenate layer, each run took into account a single layer with a single weight (from 1 to 10). Starting from the original image, we obtained the images the model generates for every layer-and-weight combination. Below, for each layer, we show the image generated by one of the evaluated weights (a representative weight chosen by me).
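The bookkeeping for this sweep is simple: ten layers times ten weights, one combination per run. In this sketch, `dream()` is a hypothetical placeholder standing in for the DeepDream-style generation step described above, not the post’s actual implementation.

```python
# Bookkeeping sketch of the per-layer sweep: one concat layer and one weight
# per run. dream() is a hypothetical stand-in for the generation step.
from itertools import product

CONCAT_LAYERS = [f"concat_{i}" for i in range(1, 11)]  # the ten concat layers
WEIGHTS = range(1, 11)  # weights 1..10, taken one at a time

def dream(image_path, layer, weight):
    """Hypothetical placeholder: generate one dream image per layer/weight."""
    return f"{image_path}-{layer}-w{weight}.jpg"  # stand-in for the output file

# 10 layers x 10 weights = 100 generated images per input picture.
results = {(layer, w): dream("input.jpg", layer, w)
           for layer, w in product(CONCAT_LAYERS, WEIGHTS)}
```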
This is the original input image given to the WeedCeption_v1 trained model:
Analyzing the effects of layer 1, we find no clear patterns related to the four categories used for training. With some creativity, some flower-like textures can be appreciated.
Now, in layers 2 and 3, more marked patterns begin to be noticed: flower textures become more recognizable, and seed-like figures begin to appear. Some elongated figures also appear, which could have some relation to pre-rolls.
Layer 4 does not present easily recognizable patterns; it could be interpreted that figures related to pre-rolls or plants influence this layer.
In layer 5, again no noticeable patterns are observed, although a few seed-related figures can be seen mainly in the upper part of the red heart.
Layer 6 presents well-marked patterns of flower textures, especially when looking at the image of the red heart.
In layers 7 and 8, no easily recognizable patterns can be seen, although some quite interesting psychedelic effects are created.
Finally, in layers 9 and 10, some beautiful flower textures are present.
These are our personal appreciations of what we get from the effects of each layer. But do you see anything else about the layers that we have not seen? Please comment below 👇!
Using Selected Layers for Final Result
Having analyzed the individual effects of each layer, we selected the layers whose interactions we wanted to analyze: layer 2 (seed/flower-like effects), layer 6 (super flower effects), and layer 9 (nice flower effects). For this combination of layers, we analyzed random selections of weights and obtained new transformations of the original image. Next, we present what we consider the most beautiful output image, obtained with weights 4 for layers 2, 6, and 9, respectively:
We can observe the beautiful cannabis-related effects that WeedDream automatically adds to the original image. Easily differentiable flower textures can be observed, and some seed-like figures also tend to appear.
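Combining several layers amounts to maximizing a weighted sum of their activations. This is a minimal sketch assuming TensorFlow/Keras; the “mixed*” layer names and the weight values are illustrative, not the exact layers and weights used for the final image, and `weights=None` only keeps the sketch lightweight (in practice you would use the retrained model).

```python
# Weighted multi-layer dream loss (assumes TensorFlow/Keras; layer names and
# weight values are illustrative -- use the retrained model in practice).
import tensorflow as tf

base = tf.keras.applications.InceptionV3(include_top=False, weights=None)
layer_weights = {"mixed2": 1.0, "mixed6": 2.0, "mixed9": 0.5}  # illustrative
feat = tf.keras.Model(base.input,
                      [base.get_layer(n).output for n in layer_weights])

def dream_loss(img):
    """Sum each selected layer's mean activation, scaled by its weight."""
    acts = feat(img)
    return tf.add_n([w * tf.reduce_mean(a)
                     for w, a in zip(layer_weights.values(), acts)])
```

Running gradient ascent on this combined loss, instead of a single layer’s, blends the selected layers’ effects into one image.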
Tell me, would you like to get your picture WeedDreamed? Comment on this post, and we will try to get the job done 🦾!
The database with which WeedCeption_v1 was trained was small and unbalanced. Even so, when using the model to generate dreams, effects related to the training categories begin to appear. We can conclude that generating a “WeedDream” model is entirely possible.
As previously mentioned, DeepDream tends to add animal figures, since it was trained with an unbalanced amount of animal images. Something similar happens in WeedDream: since the number of flower images is nearly four times greater than all the other categories combined, “flowers” is the most distinguishable effect obtained. Retraining the model with a larger number of images, to obtain WeedCeption_v2, would definitely result in a more interesting analysis. Maybe Weedmaps or Leafly can help by allowing me to use their gigantic image databases 😅🙏. We are optimistic that analyzing more images, as well as more categories, would result in amazing new effects.
All the work presented in this post can be easily replicated. The full code is available on GitHub. However, the images used for training are not ours, so we cannot include them in the repository.
I thank Nicolas Peretti, my trusted expert in Deep Learning, for the discussion and exchange of ideas.