Linear Regression in TensorFlow

Kaustubh N · SomX Labs · Mar 13, 2017

In the last post we learnt what linear regression is; now let's try to implement it in TensorFlow.

Let's represent the linear regression model first:

Y = X * W + b

The above equation is the linear regression model, so to implement linear regression we have to express this equation in TensorFlow code.
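To make the shapes concrete, here is the same equation in plain NumPy (a minimal sketch; the sample count and feature count below are arbitrary illustrative values):

import numpy as np

m, n_dim = 100, 3             # hypothetical: 100 samples, 3 features
X = np.random.rand(m, n_dim)  # feature matrix, shape [m, n_dim]
W = np.ones((n_dim, 1))       # weight vector, shape [n_dim, 1]
b = 0.5                       # bias, a scalar broadcast over all samples
Y = X.dot(W) + b              # predictions, shape [m, 1]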

Note, there will be some boilerplate code that I will skip, but you can see it in the GitHub code linked at the end of the post.

Let’s start with getting the data and normalizing it:

# Read data
features,labels = get_data()
# Normalize features
normalized_features = normalize(features)
# Add bias term 'b' as 1 and reshape the data
f, l = add_bias_reshape(normalized_features, labels)
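The actual helper implementations live in the GitHub repo; purely as illustrative assumptions, they might look something like this:

import numpy as np

def normalize(features):
    # Scale each feature column to zero mean and unit variance
    return (features - features.mean(axis=0)) / features.std(axis=0)

def add_bias_reshape(features, labels):
    # Prepend a column of ones so the bias b folds into W,
    # and reshape the labels into a column vector
    m = features.shape[0]
    f = np.c_[np.ones((m, 1)), features]
    l = labels.reshape(m, 1)
    return f, l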

Once the data is read and preprocessed, we will define the learning rate and the number of training epochs.

learning_rate = 0.01
training_epochs = 1000
loss_history = np.empty(shape=[0], dtype=float)  # to record the loss after each epoch
# Model parameters, the model being Y = X * W + b
# (the bias is folded into W through the column of ones added above,
# so n_dim counts the features plus the bias column)
X = tf.placeholder(tf.float32, [None, n_dim])
Y = tf.placeholder(tf.float32, [None, 1])
W = tf.Variable(tf.ones([n_dim, 1]))

Now let's define the ops that we will execute inside a tf.Session.

y_ = tf.matmul(X, W)                      # the linear model
loss = tf.reduce_mean(tf.square(y_ - Y))  # mean squared error
train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
init = tf.global_variables_initializer()

Multiple optimizers are available in TensorFlow; a few are mentioned below.

tf.train.GradientDescentOptimizer
tf.train.AdagradOptimizer
tf.train.MomentumOptimizer
tf.train.AdamOptimizer
tf.train.ProximalGradientDescentOptimizer
tf.train.ProximalAdagradOptimizer
tf.train.RMSPropOptimizer
And more

We will be using GradientDescentOptimizer to train the model and minimize the loss. TensorFlow makes this super easy; as seen above, it takes just one line of code.
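Swapping in a different optimizer is the same one-line change; for example, to train with Adam instead:

train_step = tf.train.AdamOptimizer(learning_rate).minimize(loss)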

Now simply execute the ops inside a session and print the mean squared error on the test set.

with tf.Session() as sess:
    sess.run(init)
    for epoch in range(training_epochs):
        sess.run(train_step, feed_dict={X: X_train, Y: Y_train})
        loss_history = np.append(loss_history, sess.run(loss, feed_dict={X: X_train, Y: Y_train}))
    y_pred = sess.run(y_, feed_dict={X: X_test})
    mse = tf.reduce_mean(tf.square(y_pred - Y_test))
    print(sess.run(mse))
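The snippet above assumes the preprocessed data has already been split into training and test sets. A minimal sketch of that step, using scikit-learn as an illustrative assumption (it would run before the graph is built, since it also fixes n_dim):

from sklearn.model_selection import train_test_split

# f and l are the preprocessed features and labels from earlier
X_train, X_test, Y_train, Y_test = train_test_split(f, l, test_size=0.2, random_state=42)
n_dim = X_train.shape[1]  # number of columns, including the bias column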

Below is a plot of the fitted line.
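A minimal matplotlib sketch to produce such a plot, assuming a dataset with a single input feature so the fit can be drawn in two dimensions (column 0 of the features is the bias column added earlier):

import matplotlib.pyplot as plt

plt.scatter(X_test[:, 1], Y_test, label='data')
order = X_test[:, 1].argsort()  # sort by x so the line is drawn left to right
plt.plot(X_test[order, 1], y_pred[order], color='red', label='fitted line')
plt.xlabel('feature')
plt.ylabel('label')
plt.legend()
plt.show()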

To visualize the model in TensorBoard, we can use a tf.summary.FileWriter to write sess.graph to a log directory, then point TensorBoard at that directory.
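A minimal sketch of that step, placed inside the session (the log directory name here is an arbitrary choice):

# Inside the with tf.Session() as sess: block
writer = tf.summary.FileWriter('./logs', sess.graph)
writer.close()

Then launch TensorBoard with tensorboard --logdir=./logs and open the Graphs tab.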

Graph as seen in TensorBoard

The code is available on GitHub; you can access it here. Contributions and suggestions are welcome.

To get in touch, follow me at @kaustubhn.

