Analysis of Kernel in Support Vector Machine Classifier

VIGHNESH TIWARI
2 min read · Jun 5, 2018


SVM is a fairly simple classifier whose goal is to find a hyperplane that separates the data, and the best hyperplane is the one that leaves the maximum margin from both classes.

[Figure: SVM linear classifier with the hyperplane equation]

Our goal here is not to revisit SVM theory but to compare different kernels and the results they give on the same data. So first I am going to work on the Social_Network_Ads.csv data to classify whether a customer purchased or not, and then check the accuracy using a confusion matrix.

https://github.com/halfbloodprince16/Machine-Learning-Algorithms/blob/master/SVM_AdsData.py

Check the code at my GitHub link above.
Now let's look at each kernel in turn.
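The linked script follows the usual train/evaluate flow: load the data, split, feature-scale, fit, predict. Here is a minimal sketch of that flow; synthetic data from make_classification stands in for Social_Network_Ads.csv (the file, its columns, and the exact split/scaling in the repo are assumptions here, not reproduced from it):

```python
# Sketch of the train/evaluate flow; synthetic two-feature data stands in
# for Social_Network_Ads.csv (an assumption, not the original file).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=400, n_features=2, n_redundant=0,
                           n_informative=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Feature scaling matters for SVMs: the margin is distance-based.
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

classifier = SVC(kernel='linear', random_state=0)
classifier.fit(X_train, y_train)
y_pred = classifier.predict(X_test)

cm = confusion_matrix(y_test, y_pred)
print(cm)
```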
1. Linear kernel, i.e. classifier = SVC(kernel='linear', random_state=0).
According to the confusion matrix we get 66 + 24 = 90 correct and 8 + 2 = 10 incorrect predictions, for an accuracy of 90%.

[Figure: Linear classifier]
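The accuracy falls straight out of the confusion matrix: correct predictions sit on the diagonal, wrong ones off it. Using the counts from the linear-kernel run (the exact cell layout is an assumption; only the 90/10 split is from the post):

```python
import numpy as np

# Confusion matrix for the linear kernel: 66 + 24 correct, 8 + 2 wrong
# (cell positions are assumed; the totals match the post).
cm = np.array([[66, 2],
               [8, 24]])

accuracy = np.trace(cm) / cm.sum()  # diagonal (correct) / all predictions
print(accuracy)  # 0.9
```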

2. Polynomial kernel, i.e. classifier = SVC(kernel='poly', random_state=0).

Accuracy = 86%

[Figure: Polynomial classifier]
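Under the hood, scikit-learn's poly kernel computes K(x, z) = (gamma * <x, z> + coef0)^degree, with degree 3 by default. A quick sanity check of that formula against the library's own polynomial_kernel (the sample points and parameter values below are arbitrary, chosen for illustration):

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

x = np.array([[1.0, 2.0]])
z = np.array([[3.0, 0.5]])
gamma, coef0, degree = 0.5, 1.0, 3

# Manual evaluation: (gamma * <x, z> + coef0) ** degree
# <x, z> = 1*3 + 2*0.5 = 4, so (0.5*4 + 1)**3 = 27
manual = (gamma * (x @ z.T) + coef0) ** degree

# Library evaluation of the same kernel
lib = polynomial_kernel(x, z, degree=degree, gamma=gamma, coef0=coef0)

print(manual, lib)  # both 27.0
```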

3. Sigmoid kernel, i.e. kernel='sigmoid'.
Accuracy = 74%

[Figure: Sigmoid classifier]
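Since the three runs differ only in the kernel argument, they can be compared in a single loop. A sketch on synthetic stand-in data (the exact accuracies will not match the 90/86/74% figures above, which come from the Social_Network_Ads.csv data):

```python
# Compare the three kernels from the post in one loop on stand-in data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_redundant=0,
                           n_informative=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
sc = StandardScaler()
X_tr, X_te = sc.fit_transform(X_tr), sc.transform(X_te)

results = {}
for kernel in ('linear', 'poly', 'sigmoid'):
    clf = SVC(kernel=kernel, random_state=0).fit(X_tr, y_tr)
    results[kernel] = accuracy_score(y_te, clf.predict(X_te))
    print(kernel, results[kernel])
```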

So we have seen three different kernels applied to the same data. The main reason the linear model gives 90% accuracy is that the data is close to linearly separable, so a linear decision boundary fits it best.
Thanks.


VIGHNESH TIWARI

3rd-year engineering student at Army Institute of Technology, Pune, India (https://github.com/halfbloodprince16)