Implementation of OR/AND/XOR operation in python using perceptron.

Pratik Somwanshi
3 min read · Jul 10, 2018


Today, we’ll implement the OR, AND, and XOR operations using a single discrete perceptron. Let’s dive straight in.

Our perceptron has two inputs and one output. We augment the inputs with one additional input fixed at +1 or -1. This is necessary so that the linear decision boundary can be placed anywhere in the plane; without the augmented input, the boundary would always pass through the origin. When we augment with +1, the corresponding weight is known as the bias; when we augment with -1, it is known as the threshold.

Here, the two inputs form four input combinations, X = [(0, 0), (0, 1), (1, 0), (1, 1)]. We augment each tuple with +1; let the augmented list of input combinations be Y = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]. For the OR operation, the desired output vector is d = [-1, 1, 1, 1]: following the signum function, I encode an output of ‘0’ as -1 and an output of ‘1’ as +1. Similarly, the desired output vector for AND is d = [-1, -1, -1, 1] and for XOR it is d = [-1, 1, 1, -1]. I initialized the weight vector to arbitrary small values, W = [-1, -2, -3].

Initialization of variables
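The setup described above can be sketched as follows; the learning rate `eta` is my assumption, since the article does not state its value:

```python
# Augmented inputs: each (x1, x2) pair gets a constant +1 bias input.
Y = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
d = [-1, 1, 1, 1]   # desired outputs for OR in the signum (+/-1) encoding
W = [-1, -2, -3]    # initial weight vector from the article
eta = 1             # learning rate (assumed; not stated in the article)
```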

We iterate through each input and compute the perceptron’s output (say, out), then compare it with the desired output. If the output is correct, the weight vector is left unchanged; if it is incorrect, we punish the perceptron by adjusting the weight vector accordingly. After processing each input, we accumulate the error: error = error + 1/2 · (desired - out)². At the end of each epoch, i.e. after iterating through all inputs, we check whether the error is zero. If it is, we stop training, as we have found the needed weight vector. If not, we iterate through the inputs once more in the next epoch/cycle. If the output of the perceptron is 0, the input lies on the decision boundary, so we treat it as wrongly classified by assigning the opposite of the desired value to the output.
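The training loop described above can be sketched like this. The update rule `W += eta * (desired - out) * x` is the standard perceptron learning rule, which I assume here since the article does not show its exact code; `max_epochs` is my own guard so the XOR case terminates.

```python
def sign(v):
    """Signum activation: +1 for positive, -1 for negative, 0 on the boundary."""
    return 1 if v > 0 else (-1 if v < 0 else 0)

def train_perceptron(Y, d, W, eta=1, max_epochs=100):
    """Train a single perceptron on augmented inputs Y with desired outputs d.

    Returns (W, epoch) on convergence, or (W, None) if the error never
    reaches zero within max_epochs (as happens for XOR)."""
    W = list(W)
    for epoch in range(1, max_epochs + 1):
        error = 0.0
        for x, target in zip(Y, d):
            out = sign(sum(wi * xi for wi, xi in zip(W, x)))
            if out == 0:
                # On the decision boundary: treat as wrongly classified by
                # assigning the opposite of the desired value.
                out = -target
            if out != target:
                # Punish the perceptron: adjust weights toward the target.
                W = [wi + eta * (target - out) * xi for wi, xi in zip(W, x)]
            error += 0.5 * (target - out) ** 2
        if error == 0:  # every input was classified correctly this epoch
            return W, epoch
    return W, None

Y = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
W_or, epochs = train_perceptron(Y, [-1, 1, 1, 1], [-1, -2, -3])
```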

Output for OR operation

By changing the desired vector to d = [-1, -1, -1, 1], we get the weight vector for the AND operation. For the XOR operation, the desired vector is d = [-1, 1, 1, -1]. But in the case of XOR, the loop runs infinitely, because the XOR outputs are not linearly separable: no single line can separate the +1 inputs from the -1 inputs, and a single perceptron can only learn linearly separable functions.
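As a small illustration of this (my own check, not from the article), a brute-force search over a grid of integer weight vectors finds a separating line for OR and AND but none for XOR:

```python
from itertools import product

def sgn(v):
    # Treat points on the boundary as -1 for this check.
    return 1 if v > 0 else -1

def separable(d):
    """Return True if some weight vector in a small integer grid classifies
    all four augmented inputs as d. The grid bound (-5..5) is arbitrary but
    ample for these operations."""
    Y = [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
    for W in product(range(-5, 6), repeat=3):
        if all(sgn(sum(wi * xi for wi, xi in zip(W, x))) == t
               for x, t in zip(Y, d)):
            return True
    return False

print(separable([-1, 1, 1, 1]))    # OR:  True
print(separable([-1, -1, -1, 1]))  # AND: True
print(separable([-1, 1, 1, -1]))   # XOR: False
```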

I’ve avoided using libraries for the vector arithmetic to keep the code simple. I hope you found this useful. Comment below with any questions.
