# Building a Neural Network Using Keras for Regression

Jan 9, 2019

In this post we will walk through a step-by-step approach to building a neural network for regression using the Keras library.

## Prerequisites

• Understanding of neural networks
• Activation functions
• Evaluating the performance of a machine learning model
• Linear regression

For regression, we will use a housing dataset.

We start by importing the basic libraries and reading the dataset. I have copied the data to my default Jupyter folder:

```python
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
%matplotlib inline

dataset = pd.read_csv('housing.csv')
dataset.head(2)
```

We use the describe method to get an understanding of the data:

```python
dataset.describe(include='all')
```

We draw a pairplot for all the variables in the dataset:

```python
sns.pairplot(dataset)
```

We create the input features and the target variable:

```python
X = dataset.iloc[:, 0:13]
y = dataset.iloc[:, 13].values
```

All input features are numerical, so we need to scale them. StandardScaler works well when the data is normally distributed, but the pair plot shows that our data is not. Hence we use MinMaxScaler, which maps each feature into the [0, 1] range:

```python
from sklearn.preprocessing import MinMaxScaler

sc_X = MinMaxScaler()
X = sc_X.fit_transform(X)

# Use a separate scaler for the target, so that we can
# inverse-transform predictions back to the original units later
sc_y = MinMaxScaler()
y = y.reshape(-1, 1)
y = sc_y.fit_transform(y)
```
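As a quick illustration of what MinMaxScaler computes, here is a toy column (the numbers are made up for illustration, not taken from the housing data): each value becomes (x − min) / (max − min).

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy column: min is 2, max is 10, so the range is 8
x = np.array([[2.0], [4.0], [10.0]])

# Each value maps to (x - min) / (max - min), landing in [0, 1]
scaled = MinMaxScaler().fit_transform(x)
print(scaled.ravel())  # values: 0.0, 0.25, 1.0
```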

Creating the training and test sets:

```python
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
```

Now we create the neural network for the regressor. We have 13 input nodes, one hidden layer with 13 nodes, and an output layer with a single node.

As this is a regression problem, the loss function we use is mean squared error, and the metrics against which we evaluate the model are mean absolute error and accuracy. (Accuracy carries little meaning for a continuous target, but Keras will still report it.)

Mean absolute error is the average of the absolute differences between the predicted and actual values.
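The two quantities are easy to compute by hand. A tiny worked example with made-up numbers:

```python
import numpy as np

# Toy actual and predicted values (illustrative, not from the model above)
y_true = np.array([3.0, 5.0, 2.0])
y_pred = np.array([2.5, 5.0, 4.0])

# Mean absolute error: average of |actual - predicted|
mae = np.mean(np.abs(y_true - y_pred))   # (0.5 + 0.0 + 2.0) / 3

# Mean squared error (our loss): average of squared differences
mse = np.mean((y_true - y_pred) ** 2)    # (0.25 + 0.0 + 4.0) / 3

print(mae, mse)
```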

We define a function, build_regressor, to use with the Keras scikit-learn wrappers. build_regressor creates and returns the Keras sequential model:

```python
from keras import Sequential
from keras.layers import Dense

def build_regressor():
    regressor = Sequential()
    regressor.add(Dense(units=13, input_dim=13))
    regressor.add(Dense(units=1))
    regressor.compile(optimizer='adam', loss='mean_squared_error', metrics=['mae', 'accuracy'])
    return regressor
```

We pass the build_regressor function to the build_fn argument when constructing the KerasRegressor wrapper. The batch size is 32 and we train for 100 epochs:

```python
from keras.wrappers.scikit_learn import KerasRegressor

regressor = KerasRegressor(build_fn=build_regressor, batch_size=32, epochs=100)
```

We now fit the model to the training data

```python
results = regressor.fit(X_train, y_train)
```

We now generate predictions for the test data:

```python
y_pred = regressor.predict(X_test)
```

Let’s plot the predicted value against the actual value

```python
fig, ax = plt.subplots()
ax.scatter(y_test, y_pred)
ax.plot([y_test.min(), y_test.max()], [y_test.min(), y_test.max()], 'k--', lw=4)
ax.set_xlabel('Measured')
ax.set_ylabel('Predicted')
plt.show()
```

The black dashed line is the identity line (predicted = actual); a perfect model would place every point on it. Most of our points fall close to the line, indicating a reasonable fit.

A couple of tips for better accuracy:

• Always scale both the input features and the target variable. Scaling only the input features leaves the network learning outputs on the raw target scale, which hurts convergence; and if the target is scaled, remember to inverse-transform the predictions back to the original units
• Data is not always normally distributed, so inspect the distribution first and then choose StandardScaler, MinMaxScaler, Normalizer or RobustScaler accordingly

These tips are based on my experience.
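To make the first tip concrete, here is a minimal sketch of scaling a target and inverse-transforming predictions back to the original units. The array values and the `sc_y` name are illustrative stand-ins, not taken from the post's dataset:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical target values (stand-ins for the housing prices)
y = np.array([[15.0], [21.6], [34.7], [50.0]])

sc_y = MinMaxScaler()
y_scaled = sc_y.fit_transform(y)

# A model trained on y_scaled emits predictions on the [0, 1] scale...
y_pred_scaled = y_scaled.copy()  # pretend the model predicted perfectly
# ...so inverse-transform to recover the original units
y_pred = sc_y.inverse_transform(y_pred_scaled)
print(y_pred.ravel())  # recovers 15.0, 21.6, 34.7, 50.0
```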


Written by

## Data Driven Investor
