Visualizing SVM with Python
In my previous article, I introduced the idea behind the Support Vector Machine classification algorithm. Here, I'm going to show you a practical application in Python of what I've been explaining, using the well-known Iris dataset.
Following the same structure as that article, I will first deal with linearly separable data, then move on to non-linearly separable data, so that you can appreciate the power of SVM, which lies in the so-called Kernel Trick.
Linearly-Separable Data
For this purpose, I'm going to use only two features and two classes of the Iris dataset (which contains 4 features and 3 classes). To do so, let's first have a look at the correlations among features, so that we can pick the features and classes that give us a linearly separable demo dataset.
import seaborn as sns
import matplotlib.pyplot as plt

iris = sns.load_dataset("iris")
print(iris.head())

y = iris.species
X = iris.drop('species', axis=1)

sns.pairplot(iris, hue="species", palette="bright")
plt.show()
Since I first want to deal with linearly separable, two-class data, I will focus on the petal_length vs. petal_width plot, restricted to the classes Setosa and Versicolor.
Let’s visualize it in the form we are going to use:
df=iris[(iris['species']!='virg…
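The snippet above is cut off, but the idea is to drop one class and keep two features. A minimal sketch, assuming the truncated filter excludes Virginica and that we keep only the two petal features discussed above:

```python
import seaborn as sns
import matplotlib.pyplot as plt

iris = sns.load_dataset("iris")

# Keep only Setosa and Versicolor (assumed: the truncated filter drops Virginica)
df = iris[iris["species"] != "virginica"]

# Two features, two classes: a linearly separable demo dataset
X = df[["petal_length", "petal_width"]]
y = df["species"]

# Scatter plot colored by class, in the form we will use for the SVM demo
plt.scatter(X["petal_length"], X["petal_width"],
            c=(y == "setosa"), cmap="coolwarm")
plt.xlabel("petal_length")
plt.ylabel("petal_width")
plt.show()
```

With this filter, the dataset keeps 100 of the 150 Iris samples, 50 per class, and the two clusters are visibly separable by a straight line.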