Basic models using TensorFlow
Let's dive into the code:
Line 1: __future__ is a pseudo-module used to enable language features from future Python versions before they become the default in the current interpreter.
Line 2: Here's where it starts. Let's import our tensorflow module.
Line 3: import numpy to work with numeric matrices.
Line 4: import matplotlib, because we will be plotting graphs!
Line 5: to generate random numbers.
Line 8: the learning rate, which sets the step size of each gradient-descent update.
Line 9: the number of training epochs (full passes over the training data).
Line 10: how often we want progress to be displayed. In our case, every 50 epochs.
Line 13: the training dataset.
Line 14: the testing dataset.
Line 15: the number of training examples.
Line 20–21: let's define our graph inputs as placeholders.
Line 24: the weight W is defined as a tf.Variable, initialized with a random number.
Line 25: the bias b is likewise a tf.Variable, initialized with a random number.
Line 28: constructing a linear model. Our prediction is XW + b.
Line 31: as the cost function we calculate the mean squared error: Σ(pred − Y)² / (2 · number of training examples).
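To make the model and cost concrete, here is a plain-Python sketch of what lines 28 and 31 compute (illustrative stand-ins for the TensorFlow graph ops, not the actual code):

```python
def predict(X, W, b):
    # Linear model: pred = X*W + b, applied element-wise.
    return [x * W + b for x in X]

def mse_cost(pred, Y, n_samples):
    # Mean squared error: sum((pred - Y)^2) / (2 * n_samples).
    return sum((p - y) ** 2 for p, y in zip(pred, Y)) / (2 * n_samples)

X = [1.0, 2.0, 3.0]
Y = [2.0, 4.0, 6.0]                 # y = 2x exactly
pred = predict(X, W=2.0, b=0.0)
print(mse_cost(pred, Y, len(X)))    # 0.0: a perfect fit has zero cost
```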
Line 34: using gradient descent as our learning algorithm.
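The optimizer works by repeatedly nudging W and b against the gradient of the cost above. A minimal plain-Python sketch of a single update (a hypothetical helper, not TensorFlow's optimizer):

```python
def gradient_step(X, Y, W, b, learning_rate):
    # For cost = sum((X*W + b - Y)^2) / (2n):
    #   dC/dW = sum((X*W + b - Y) * X) / n
    #   dC/db = sum(X*W + b - Y) / n
    n = len(X)
    errors = [x * W + b - y for x, y in zip(X, Y)]
    grad_W = sum(e * x for e, x in zip(errors, X)) / n
    grad_b = sum(errors) / n
    return W - learning_rate * grad_W, b - learning_rate * grad_b

W, b = gradient_step([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], 0.0, 0.0, 0.01)
print(W, b)   # W has moved up toward the true slope of 2
```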
Line 40–50: Alright, let's launch our graph! We create a session, then iterate over the training data, running the optimizer on the training inputs and outputs. Logs are displayed every 50 epochs, as set on line 10.
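Putting the pieces together, the training loop in lines 40–50 boils down to something like this plain-Python sketch (the variable names mirror the hyperparameters above; the real code performs each update by running the optimizer op inside a tf.Session):

```python
train_X = [1.0, 2.0, 3.0, 4.0]
train_Y = [3.0, 5.0, 7.0, 9.0]       # y = 2x + 1
learning_rate, training_epochs, display_step = 0.01, 1000, 50

W, b = 0.0, 0.0
n = len(train_X)
for epoch in range(training_epochs):
    # One gradient-descent update on the MSE cost.
    errors = [x * W + b - y for x, y in zip(train_X, train_Y)]
    W -= learning_rate * sum(e * x for e, x in zip(errors, train_X)) / n
    b -= learning_rate * sum(errors) / n
    # Display logs every display_step epochs.
    if (epoch + 1) % display_step == 0:
        cost = sum(e * e for e in errors) / (2 * n)
        print("Epoch %04d cost=%.4f W=%.4f b=%.4f" % (epoch + 1, cost, W, b))
```

After 1000 epochs W and b settle close to the true slope and intercept.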
Rest of the code: we print out the final training cost, weight, and bias values, and display the fit graphically using the matplotlib module. Likewise, we run the model on the testing data and display the results graphically.
Similarly, we can use TensorFlow for logistic regression. Here's the code:
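For orientation before the code: logistic regression replaces the linear prediction with a sigmoid probability and the MSE with cross-entropy. A plain-Python sketch of those two ingredients (illustrative only, not the TensorFlow code):

```python
import math

def sigmoid(z):
    # Squash a raw score into a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy(p, y):
    # Cost for one example: true label y in {0, 1}, predicted probability p.
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

p = sigmoid(2.0)                     # a fairly confident positive score
print(round(p, 3))                   # 0.881
print(cross_entropy(p, 1) < cross_entropy(p, 0))  # True: the cost is lower
                                                  # when the label agrees
```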
Here's the code for k-nearest neighbours using TensorFlow:
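As a reminder of the idea behind that code: k-nearest neighbours labels a query point by a majority vote among the k closest training points. A tiny plain-Python sketch (the TensorFlow version expresses the same distance computation as graph ops; the names here are hypothetical):

```python
def knn_predict(train_X, train_Y, x, k=3):
    # Sort training points by L1 distance to x, then take a
    # majority vote among the labels of the k nearest ones.
    dists = sorted(zip((abs(tx - x) for tx in train_X), train_Y))
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

train_X = [1.0, 1.2, 3.0, 3.1, 3.3]
train_Y = ["a", "a", "b", "b", "b"]
print(knn_predict(train_X, train_Y, 1.1))   # prints a: the two closest
                                            # neighbours are both labelled "a"
```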
Here are some of the hacks:
- For tf.Variable, you have to provide an initial value when you declare it.
- With tf.placeholder, you don't have to provide an initial value; you can specify it at run time with the feed_dict argument inside Session.run.
- Any computation for your TensorFlow graph must be performed inside a TensorFlow Session.
Alright, that's it for now! Thank you for spending your time. Cheers!