Analysis of Kernel in Support Vector Machine Classifier
SVM is a conceptually simple classifier whose goal is to find a hyperplane that separates the data into classes. The best hyperplane is the one that leaves the maximum margin to both classes.
Our motive here is not to revisit SVM theory but to compare different kernel choices and the results they give on the same data. I will work with the Social_Network_Ads.csv dataset to classify whether a customer purchased or not, and then check the accuracy using a confusion matrix.
https://github.com/halfbloodprince16/Machine-Learning-Algorithms/blob/master/SVM_AdsData.py
The full code is available at my GitHub link above.
Now let's compare three kernels.
1. Linear kernel, i.e. classifier = SVC(kernel='linear', random_state=0).
According to the confusion matrix we got 66 + 24 = 90 correct and 8 + 2 = 10 incorrect predictions, i.e. an accuracy of 90%.
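A minimal sketch of the linear-kernel pipeline follows. The original Social_Network_Ads.csv is not bundled here, so synthetic two-feature data stands in for the Age/Salary columns (an assumption); the structure of the scaling, fitting, and evaluation steps is the same as in the linked script.

```python
# Linear-kernel SVM pipeline; synthetic data stands in for the CSV (assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Feature scaling matters for SVMs: the margin is distance-based.
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

classifier = SVC(kernel='linear', random_state=0)
classifier.fit(X_train, y_train)
y_pred = classifier.predict(X_test)

cm = confusion_matrix(y_test, y_pred)
print(cm)                               # 2x2 matrix: [[TN, FP], [FN, TP]]
print(accuracy_score(y_test, y_pred))   # fraction of correct predictions
```

Summing the diagonal of the confusion matrix gives the correct predictions; the off-diagonal entries are the errors, which is exactly how the 90/10 split above is read off.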
2. Polynomial kernel, i.e. classifier = SVC(kernel='poly', random_state=0).
Accuracy = 86%.
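Only the kernel argument changes; the polynomial kernel also takes a degree parameter that controls how curved the decision boundary can be. A sketch, again on synthetic stand-in data (assumption), with scikit-learn's default degree made explicit:

```python
# Polynomial-kernel SVM; degree=3 is scikit-learn's default, shown explicitly.
# Synthetic data stands in for the original CSV (assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
sc = StandardScaler().fit(X_tr)

clf = SVC(kernel='poly', degree=3, random_state=0)
clf.fit(sc.transform(X_tr), y_tr)
acc = accuracy_score(y_te, clf.predict(sc.transform(X_te)))
print(acc)
```

Raising the degree gives a more flexible boundary but risks overfitting, which is one reason a polynomial kernel can score below a linear one on near-linear data.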
3. Sigmoid kernel, i.e. kernel='sigmoid'.
Accuracy = 74%
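The sigmoid kernel computes tanh(gamma * <x, x'> + coef0), so gamma and coef0 are the knobs worth knowing about. A sketch with scikit-learn's defaults written out, again on synthetic stand-in data (assumption):

```python
# Sigmoid-kernel SVM: tanh(gamma * <x, x'> + coef0).
# gamma='scale' and coef0=0.0 are scikit-learn's defaults, made explicit here.
# Synthetic data stands in for the original CSV (assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
sc = StandardScaler().fit(X_tr)

clf = SVC(kernel='sigmoid', gamma='scale', coef0=0.0, random_state=0)
clf.fit(sc.transform(X_tr), y_tr)
acc = accuracy_score(y_te, clf.predict(sc.transform(X_te)))
print(acc)
```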
So we saw three different kernels on the same data. The linear model gives the best accuracy (90%) because this data is close to linearly separable, so a linear decision boundary fits it best; the more flexible polynomial and sigmoid kernels add nothing here and score lower.
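This whole comparison can be collapsed into one loop over the kernel names, which is a handy pattern when trying kernels on a new dataset. As before, synthetic data stands in for the CSV (assumption), so the printed accuracies will differ from the 90/86/74% reported above:

```python
# Compare the three kernels from the article on one dataset in a single loop.
# Synthetic data stands in for the original CSV (assumption).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
sc = StandardScaler().fit(X_tr)
X_tr_s, X_te_s = sc.transform(X_tr), sc.transform(X_te)

results = {}
for kernel in ('linear', 'poly', 'sigmoid'):
    clf = SVC(kernel=kernel, random_state=0).fit(X_tr_s, y_tr)
    results[kernel] = accuracy_score(y_te, clf.predict(X_te_s))
print(results)  # kernel name -> test accuracy
```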
Thanks.