Implementing a Neural Network without Deep Learning Libraries: A Step-by-Step Tutorial in Python 3

Amber
4 min read · Jan 19, 2019


Nowadays, the Neural Network (NN) is a very powerful model that can be applied to a wide range of problems. This article is a step-by-step tutorial for implementing an NN, including Forward Propagation and Backward Propagation, without any library such as TensorFlow or Keras.

  • We have discussed the concepts behind NNs in week 4 and week 5.
  • The complete source code used in this article is available here.

First Step: Preparing Dataset

We use the Iris dataset from scikit-learn for demonstration and visualize it as a table.
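A minimal sketch of this step, using pandas purely to display the data as a table (pandas is not part of the NN itself):

```python
import pandas as pd
from sklearn.datasets import load_iris

# Load the Iris dataset: 150 samples, 4 features, 3 classes.
iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)
df["label"] = iris.target  # 0: Setosa, 1: Versicolor, 2: Virginica
print(df.head())
```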

Relabel Dataset

For convenience’s sake, we convert this into a binary classification problem by relabeling the Iris dataset — 1: Setosa, 0: not Setosa.
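The relabeling is a one-liner:

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data                         # 150 samples x 4 features
y = (iris.target == 0).astype(int)    # 1: Setosa, 0: not Setosa
```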

Plot Dataset

Here, we show the result of relabeling with two plots. One uses the two sepal features (width and length), and the other uses the two petal features (width and length).

The color of points represents the data’s label.

  • Original Label — Red: Setosa, Blue: Versicolor, Yellow: Virginica.
  • Relabel — Red: Setosa, Blue: Not Setosa.
[Figure: scatter plots of the sepal features (left) and the petal features (right)]
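The two plots described above can be produced with matplotlib; the colors follow the relabeled scheme (red: Setosa, blue: Not Setosa). This is a sketch, not necessarily the article's original plotting code:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this line to show the plots
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data
y = (iris.target == 0).astype(int)  # 1: Setosa, 0: not Setosa

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, (i, j), title in zip(axes, [(0, 1), (2, 3)],
                             ["sepal features", "petal features"]):
    ax.scatter(X[y == 1, i], X[y == 1, j], c="red", label="Setosa")
    ax.scatter(X[y == 0, i], X[y == 0, j], c="blue", label="Not Setosa")
    ax.set_xlabel(iris.feature_names[i])
    ax.set_ylabel(iris.feature_names[j])
    ax.set_title(title)
    ax.legend()
fig.savefig("iris_relabel.png")
```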

Of course, a simple logistic regression could solve this classification problem, but that is not the point of this tutorial; our goal is to demonstrate how to design a Neural Network for solving classification problems.

Second Step — Building NN model

Here is the NN model we want to build. The input layer has 4 units, since the Iris dataset has 4 features — ‘sepal length (cm)’, ‘sepal width (cm)’, ‘petal length (cm)’, ‘petal width (cm)’. The output layer has only one unit, since this is a binary classification problem — Setosa and Not-Setosa. For convenience’s sake, we use only one hidden layer with six units. Let’s start!

Step2.1 — Initialization of an NN model

First, initialize the model and set the weighting matrices W¹ and W².
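A minimal initialization sketch for the 4-6-1 architecture. The uniform range and fixed seed are illustrative assumptions, and folding the bias term into each weighting matrix as an extra first column is an implementation choice of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)           # fixed seed for reproducibility
n_input, n_hidden, n_output = 4, 6, 1    # 4 features -> 6 hidden units -> 1 output

# Each matrix has one extra column for the bias unit.
W1 = rng.uniform(-0.5, 0.5, size=(n_hidden, n_input + 1))    # input  -> hidden
W2 = rng.uniform(-0.5, 0.5, size=(n_output, n_hidden + 1))   # hidden -> output
```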

Step2.2 — Implement Forward Propagation

Recall — a neuron (node/unit) is essentially a logistic unit with a Sigmoid (logistic) activation function, i.e. a simple logistic regression.

After the Forward Propagation procedure, we get the final value (the network’s output).
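One way to implement the forward pass for this 4-6-1 network, assuming the bias unit is folded into W¹ and W² as an extra first column (an assumption of this sketch):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Propagate one sample x (4 features) through the 4-6-1 network."""
    a1 = np.append(1.0, x)              # input activations with bias unit
    z2 = W1 @ a1                        # hidden pre-activations
    a2 = np.append(1.0, sigmoid(z2))    # hidden activations with bias unit
    z3 = W2 @ a2                        # output pre-activation
    a3 = sigmoid(z3)                    # final value in (0, 1)
    return a1, a2, a3

# Sanity check with zero weights: sigmoid(0) = 0.5 at every unit.
W1 = np.zeros((6, 5))
W2 = np.zeros((1, 7))
_, _, out = forward(np.array([5.1, 3.5, 1.4, 0.2]), W1, W2)
print(out)  # [0.5]
```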

Step2.3 — Implement Loss Function (i.e. Cost Function)

In this problem, the loss function J(θ) we use applies only to binary classification problems (e.g. Setosa or Not-Setosa). The general form of J(θ) is discussed in week 5.
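For binary classification, J(θ) is the binary cross-entropy, J(θ) = -(1/m) Σ [y log(h) + (1 - y) log(1 - h)]. A minimal sketch (the epsilon guard against log(0) is an implementation detail of this sketch):

```python
import numpy as np

def cost(h, y):
    """Binary cross-entropy J over predictions h and labels y (both arrays)."""
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(h + eps) + (1.0 - y) * np.log(1.0 - h + eps))

# A 50/50 guess on a positive sample costs log(2) ~ 0.6931.
print(cost(np.array([0.5]), np.array([1])))
```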

Step2.4 — Implement Backward Propagation

We discussed the details of Backward Propagation in week5.

First, compute the partial derivatives of J(θ).

Second, update the weighting matrices W¹ and W² with the partial derivatives of J(θ).
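Both sub-steps can be sketched as one stochastic-gradient update per sample (the learning rate here is an illustrative assumption). With a sigmoid output and the cross-entropy loss, the output-layer error conveniently reduces to a₃ - y:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    a1 = np.append(1.0, x)
    a2 = np.append(1.0, sigmoid(W1 @ a1))
    a3 = sigmoid(W2 @ a2)
    return a1, a2, a3

def backward(x, y, W1, W2, lr=0.1):
    """One stochastic-gradient step; updates W1 and W2 in place."""
    a1, a2, a3 = forward(x, W1, W2)
    # First: the partial derivatives of J via the delta (error) terms.
    delta3 = a3 - y                                          # output-layer error
    delta2 = (W2[:, 1:].T @ delta3) * a2[1:] * (1 - a2[1:])  # hidden error (bias excluded)
    # Second: gradient-descent update of the weighting matrices.
    W2 -= lr * np.outer(delta3, a2)
    W1 -= lr * np.outer(delta2, a1)

# Demo: repeated updates on one positive sample push the output toward 1.
rng = np.random.default_rng(1)
W1 = rng.uniform(-0.5, 0.5, (6, 5))
W2 = rng.uniform(-0.5, 0.5, (1, 7))
x, y = np.array([5.1, 3.5, 1.4, 0.2]), np.array([1.0])
before = forward(x, W1, W2)[2][0]
for _ in range(50):
    backward(x, y, W1, W2)
after = forward(x, W1, W2)[2][0]
```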

Step2.5 — Implement Training and Testing Stage

With the trained weighting matrices W¹ and W², we can execute Forward Propagation to get the final value, and then use that value for prediction.

  • If the final value ≥ 0.5, predict Setosa.
  • If the final value < 0.5, predict Not Setosa.
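The thresholding rule above as a small sketch, assuming the bias-folded weighting matrices used throughout:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x, W1, W2):
    """Return 1 (Setosa) if the final value >= 0.5, else 0 (Not Setosa)."""
    a1 = np.append(1.0, x)
    a2 = np.append(1.0, sigmoid(W1 @ a1))
    a3 = sigmoid(W2 @ a2)
    return int(a3[0] >= 0.5)
```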

Third Step — Testing model
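Since the article’s testing code is shown elsewhere, here is an end-to-end sketch that puts the previous steps together; the train/test split, learning rate, epoch count, and random seed are illustrative assumptions, not the article’s original values:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    a1 = np.append(1.0, x)
    a2 = np.append(1.0, sigmoid(W1 @ a1))
    a3 = sigmoid(W2 @ a2)
    return a1, a2, a3

# Relabeled Iris data: 1 = Setosa, 0 = not Setosa.
iris = load_iris()
X = iris.data
y = (iris.target == 0).astype(float)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# 4-6-1 network, bias folded into each weighting matrix.
rng = np.random.default_rng(0)
W1 = rng.uniform(-0.5, 0.5, (6, 5))
W2 = rng.uniform(-0.5, 0.5, (1, 7))

# Training: stochastic gradient descent, sample by sample.
lr, epochs = 0.1, 100
for _ in range(epochs):
    for x_i, y_i in zip(X_train, y_train):
        a1, a2, a3 = forward(x_i, W1, W2)
        delta3 = a3 - y_i
        delta2 = (W2[:, 1:].T @ delta3) * a2[1:] * (1 - a2[1:])
        W2 -= lr * np.outer(delta3, a2)
        W1 -= lr * np.outer(delta2, a1)

# Testing: threshold the final value at 0.5.
preds = np.array([forward(x_i, W1, W2)[2][0] >= 0.5 for x_i in X_test])
accuracy = np.mean(preds == y_test.astype(bool))
print(f"test accuracy: {accuracy:.2f}")
```

Setosa is linearly separable from the other two classes, so even this small network should reach high accuracy quickly.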

Summary

The source code here is flexible: you can change the number of hidden and input units according to your dataset, but it only supports binary classification problems. You can try it or extend it yourself. We hope this gives you a more concrete understanding of NNs. If you like it, please give me a 👏. Any feedback, thoughts, comments, suggestions, or questions are welcome!
