Basic Recurrent Neural Network Tutorial — 3

Ting-Hao Chen
Machine Learning Notes
2 min read · Jan 8, 2018

We will use tf.nn.rnn_cell.BasicRNNCell + tf.nn.dynamic_rnn to build a simple RNN.

If you are interested, the code for this post (Jupyter notebook and Python file) can be found here.


Build Basic RNN Cell with dynamic_rnn

Detailed explanations can be found in the previous post, which used static_rnn. Here I’ll basically just replace static_rnn with dynamic_rnn.
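For comparison, here is a sketch (not from the original post) of what the static_rnn version of the same graph looks like, assuming the same cell and X placeholder defined in the snippets below, built in a fresh graph. static_rnn expects a Python list with one [batch_size, n_inputs] tensor per time step, so the time axis has to be unstacked first:

# static_rnn equivalent (sketch): split the time axis into a list of tensors
X_seqs = tf.unstack(X, axis=1)  # n_steps tensors, each [batch_size, n_inputs]
outputs, state = tf.nn.static_rnn(cell, X_seqs, dtype=tf.float32)
output = tf.stack(outputs, axis=1)  # restack to [batch_size, n_steps, n_neurons]

dynamic_rnn avoids this unstacking by running the cell inside a while_loop, so it takes the [batch_size, n_steps, n_inputs] tensor directly.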

Our input data:

import numpy as np

# input data
X_data = np.array([
    # steps:   1st       2nd        3rd
    [[1, 2],   [7, 8],   [13, 14]],   # first sample in the batch
    [[3, 4],   [9, 10],  [15, 16]],   # second sample in the batch
    [[5, 6],   [11, 12], [17, 18]]    # third sample in the batch
])  # shape: [batch_size, n_steps, n_inputs] = [3, 3, 2]
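To make the shape convention concrete, a quick NumPy check (a minimal sketch, not in the original post):

print(X_data.shape)  # (3, 3, 2) -> batch_size=3, n_steps=3, n_inputs=2
print(X_data[0])     # first sample: 3 time steps, each with 2 features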

Define an RNN model:

import tensorflow as tf

# hyperparameters
n_neurons = 8

# parameters
n_inputs = X_data.shape[2]  # 2
n_steps = X_data.shape[1]   # 3

# rnn model
X = tf.placeholder(tf.float32, [None, n_steps, n_inputs])

cell = tf.nn.rnn_cell.BasicRNNCell(num_units=n_neurons)
output, state = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)
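dynamic_rnn returns two tensors: output holds the hidden state at every time step, while state is the hidden state after the last step. If you only need the final activation, e.g. to feed a classifier, a common pattern is to slice the last step (a sketch, not part of the original post):

# hidden state at the last time step, shape [batch_size, n_neurons];
# for BasicRNNCell this carries the same values as `state`
last_output = output[:, -1, :]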

Run the graph to check the output and state shapes:

# initialize the variables
init = tf.global_variables_initializer()

# run the graph and inspect the shapes
with tf.Session() as sess:
    sess.run(init)
    feed_dict = {X: X_data}
    output_shape = sess.run(tf.shape(output), feed_dict=feed_dict)
    state_shape = sess.run(tf.shape(state), feed_dict=feed_dict)
    print('output shape [batch_size, n_steps, n_neurons]: ', output_shape)
    print('state shape [batch_size, n_neurons]: ', state_shape)

The output:

output shape [batch_size, n_steps, n_neurons]:  [3 3 8]
state shape [batch_size, n_neurons]:  [3 8]
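As a final sanity check (a sketch, not from the original post): for BasicRNNCell with full-length sequences, the state returned by dynamic_rnn equals the output at the last time step, which we can confirm numerically:

with tf.Session() as sess:
    sess.run(init)
    out_val, state_val = sess.run([output, state], feed_dict={X: X_data})
    # the last-step slice of `output` matches `state` for BasicRNNCell
    print(np.allclose(out_val[:, -1, :], state_val))  # True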
