[WEEK 4] Predicting the pieces of furniture in living rooms

Mohammed ALI
bbm406f18
Dec 23, 2018

Group Members: Mohammed ALI, Aybüke Yalçıner, Hatice Acar.

This week, we implemented additional algorithms on our dataset, compared their results with those of our convolutional neural network, and compared them with each other. We looked into the Multilayer Perceptron and k-NN algorithms for classifying the objects.

Preliminary results of the convolutional neural network

As we can see, our model implemented with a convolutional neural network suffers from overfitting. We are currently collecting more data and tuning the hidden layers to address this issue.

A result of the CNN in our code

Using Different Algorithms

We are now working to optimize our solution by implementing different algorithms such as the Multilayer Perceptron and the k-NN algorithm.

1- Multilayer Perceptron

The major components of a Perceptron are the following:

  • Inputs: All the features become the input for the Perceptron. In our case, the features of all ten object classes are fed in as input.
  • Weights: We initialize the weights with small values, and these values get updated after each training error.
  • Activation function: applied to the weighted sum to introduce non-linearity, so the model can learn non-linear decision boundaries.
  • Output: The weighted sum is passed to the activation function, and the value we get after this computation is our predicted output.

Multilayer Perceptron Implementation

First: The features of an example are given as input to the Perceptron.

Second: These ten input features get multiplied by their corresponding weights (initialized to small but non-zero values).

Third: The products of each feature with its corresponding weight are summed.

Fourth: The bias is added to this sum.

Finally: The activation function is applied to the resulting value.
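The steps above can be sketched in a few lines of Python. This is a minimal illustration, not our actual implementation: it assumes ten input features and uses a sigmoid activation, which is just one common choice.

```python
import numpy as np

def perceptron_forward(x, w, b):
    """Forward pass of a single perceptron:
    multiply features by weights, sum, add bias, apply activation."""
    z = np.dot(w, x) + b              # steps 2-4: multiply, sum, add bias
    return 1.0 / (1.0 + np.exp(-z))   # final step: sigmoid activation

# Toy example with ten input features, as in our setup.
rng = np.random.default_rng(0)
x = rng.random(10)                    # one example's feature vector
w = rng.normal(0.0, 0.01, size=10)    # small non-zero initial weights
b = 0.0
y = perceptron_forward(x, w, b)       # predicted output in (0, 1)
```

In training, the weights `w` and bias `b` would then be updated whenever the prediction disagrees with the true label; a multilayer perceptron stacks several such layers of units.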

2- k-NN Algorithm

The k-nearest-neighbor algorithm is one of the simplest machine learning algorithms. It predicts the label of a query instance by finding the k training samples at minimum distance from it and letting those neighbors determine the prediction. It is prohibitively slow, however, since every query must be compared against all training samples, and we have more than 800 images.
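The idea can be sketched as follows. This is a toy illustration with made-up 2-D points, assuming Euclidean distance and majority voting; our real feature vectors are much larger.

```python
import numpy as np

def knn_predict(X_train, y_train, query, k=3):
    """Predict the label of `query` by majority vote among its
    k nearest training samples (Euclidean distance)."""
    dists = np.linalg.norm(X_train - query, axis=1)  # distance to every sample
    nearest = np.argsort(dists)[:k]                  # indices of the k closest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                 # majority label

# Toy two-class example.
X_train = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
pred = knn_predict(X_train, y_train, np.array([0.05, 0.0]), k=3)  # → 0
```

Note that the distance computation touches every training sample on every query, which is why the method becomes slow on our dataset of 800+ images.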

Conclusion:

Next week, we are going to explain the results of the CNN and Perceptron algorithms in detail.

