Only Numpy: Vanilla Recurrent Neural Network with Activation Deriving Back propagation Through Time Practice — part 2/2

Jae Duk Seo
3 min read · Dec 24, 2017


So today we are going to do the same thing as before, but add one additional component: an activation function. For now let's keep it simple and use the logistic function. (Note we will use the notation log(), and when implemented in Python it could look like the code below.)

import numpy as np

def log(x):
    return 1 / (1 + np.exp(-1 * x))

def d_log(x):
    return log(x) * (1 - log(x))
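As a quick sanity check (my addition, not in the original post), the analytic derivative d_log() can be compared against a central finite difference:

```python
import numpy as np

def log(x):
    # logistic (sigmoid) function, as defined above
    return 1 / (1 + np.exp(-1 * x))

def d_log(x):
    # derivative of the logistic function
    return log(x) * (1 - log(x))

# compare the analytic derivative to a central finite difference
h = 1e-5
numeric = (log(0.7 + h) - log(0.7 - h)) / (2 * h)
print(np.isclose(numeric, d_log(0.7)))  # True
```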

Anyway, here is the starting point: the training data (x), the ground truth data (y), and what we wish to do, which is to count how many ones there are in a given line of data.
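The actual arrays appear in the post's images; a minimal, hypothetical version of such data could look like this:

```python
import numpy as np

# hypothetical training data: each row of x is one line of binary data,
# and the matching entry of y counts how many ones that line contains
x = np.array([[0, 0, 0],
              [0, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
y = np.array([0, 1, 2, 3])

print((x.sum(axis=1) == y).all())  # True: each label is the number of ones
```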

But here is the difference: the relationship between the current state and the next state is defined below.

(E) is the equation defining the relationship between the current state k and the previous state k-1. Now let's perform the forward feed process, which is defined at (F). But wait, State 3 does not have a log() function. WHY?

The logistic function outputs a number between 0 and 1. If the last state were squashed this way, the network could never output a count of 2 or 3, so whenever a line of data contains more than one 1, it could not predict the correct number of ones!
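To make this concrete, here is a sketch of the forward pass in (F) with hypothetical scalar weights wx and wrec (my own illustrative values, not the post's): states 1 and 2 are squashed by log(), while State 3 is left linear so the output can exceed 1.

```python
import numpy as np

def log(x):
    return 1 / (1 + np.exp(-1 * x))

wx, wrec = 0.5, 1.2            # hypothetical weights, for illustration only
x = np.array([1.0, 1.0, 0.0])  # one input per time step
state_0 = 0.0                  # initial state

state_1 = log(x[0] * wx + state_0 * wrec)
state_2 = log(x[1] * wx + state_1 * wrec)
state_3 = x[2] * wx + state_2 * wrec  # no log(): the true count may be 2 or 3
```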

Before moving on I want to do something different: to make back propagation easier to explain, I want to split some terms.

So each State has a State (Out), which means the output of the state, and a State (In), which means the input of the state.
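In code, that split for a single time step k might look like this (the variable names and weight values are mine, for illustration):

```python
import numpy as np

def log(x):
    return 1 / (1 + np.exp(-1 * x))

wx, wrec = 0.5, 1.2             # hypothetical weights
x_k, state_out_prev = 1.0, 0.3  # current input and previous State (Out)

state_in_k = x_k * wx + state_out_prev * wrec  # State (In): before activation
state_out_k = log(state_in_k)                  # State (Out): after activation
```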

Now let's perform back propagation. Again, we are going to follow the same steps as in the first tutorial: derive the derivative with respect to Wx and Wrec at each time step.

Just like in my previous post, try to derive the derivatives for time step 1 by yourself (it will really help). Also, can you spot the difference between having an activation function and not having one?

The colored boxes are the regions where we take the derivative of the activation function, and that is it!
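Putting the chain together, here is a sketch of one backward pass under an assumed squared-error cost on State 3 (the cost, weights, and data here are illustrative assumptions, not taken from the post). The d_log() factors appear exactly where a state had an activation, i.e. at the colored-box regions:

```python
import numpy as np

def log(x):
    return 1 / (1 + np.exp(-1 * x))

def d_log(x):
    return log(x) * (1 - log(x))

# hypothetical setup: squared-error cost on the final state
wx, wrec = 0.5, 1.2
x = np.array([1.0, 1.0, 0.0])
y = 2.0

# forward pass, keeping each State (In) for the backward pass (state_0 = 0)
s_in_1 = x[0] * wx
s_out_1 = log(s_in_1)
s_in_2 = x[1] * wx + s_out_1 * wrec
s_out_2 = log(s_in_2)
s_in_3 = x[2] * wx + s_out_2 * wrec   # no activation on the last state
cost = 0.5 * (s_in_3 - y) ** 2

# backward pass: d_log() shows up only where a state had a log()
grad_3 = s_in_3 - y                     # dCost / dState3(In)
grad_2 = grad_3 * wrec * d_log(s_in_2)  # through State 2's activation
grad_1 = grad_2 * wrec * d_log(s_in_1)  # through State 1's activation

# accumulate the weight gradients across all time steps
d_wx = grad_3 * x[2] + grad_2 * x[1] + grad_1 * x[0]
d_wrec = grad_3 * s_out_2 + grad_2 * s_out_1 + grad_1 * 0.0  # state_0 = 0
```

A numerical gradient check (perturbing wx and wrec and re-running the forward pass) is a good way to convince yourself the chain rule was applied correctly.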

Unfortunately, for now I don't have code to go with this math; however, I still have a video tutorial as well as code for a vanilla RNN without the activation functions.

Update Dec 23, 2017: Here is an interactive version of the code; follow this link.

Code on GitHub: Link

Code: https://github.com/JaeDukSeo/Only_Numpy_Basic/blob/master/rnn/a_rnn_simple.py

Hope you learned something!

For more tutorials, check out my website and my YouTube channel!

Website: https://jaedukseo.me/

YouTube Channel: https://www.youtube.com/c/JaeDukSeo
