Classification: Machine Learning

Anoop Singh
Mar 31, 2018 · 3 min read


Unlike regression, where you predict a continuous number, classification is used to predict a category. It has a wide variety of applications, including sentiment analysis, ad targeting, spam detection, risk assessment, medical diagnosis, and image classification. The core goal of classification is to predict a category or class y from some inputs x. Classification models include linear models such as Logistic Regression and linear SVM, and nonlinear ones such as K-NN, Kernel SVM, and Random Forests. Like regression, classification is a form of supervised learning.

Here we will learn how to implement the following Machine Learning classification models in Python:

  1. Logistic Regression
  2. K-Nearest Neighbors (K-NN)
  3. Support Vector Machine (SVM)
  4. Kernel SVM
  5. Naive Bayes
  6. Decision Tree Classification
  7. Random Forest Classification

Implementation of Logistic Regression:

The logistic function, also called the sigmoid function, was developed by statisticians to describe properties of population growth in ecology: rising quickly and maxing out at the carrying capacity of the environment. It is an S-shaped curve that can take any real-valued number and map it into a value between 0 and 1, but never exactly at those limits.

Note: Logistic Regression is a linear classifier.
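
To make the S-shape concrete, here is a minimal sketch (not from the original post) of the sigmoid function 1 / (1 + e^-x) evaluated over a small range of inputs:

import numpy as np

def sigmoid(x):
    # squashes any real number into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))

z = np.linspace(-6, 6, 7)
print(sigmoid(z))  # rises from about 0.002 to about 0.998, passing through 0.5 at x = 0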

Importing the libraries:

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

Importing the dataset and selecting the feature columns and the target column:

dataset = pd.read_csv('Social_Network_Ads.csv')
X = dataset.iloc[:, [2, 3]].values
y = dataset.iloc[:, 4].values
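
iloc selects columns by integer position, so columns 2 and 3 (Age and Estimated Salary, judging by the plot labels used later) become the feature matrix X and column 4 becomes the target vector y. A quick, optional sanity check:

print(dataset.head())   # confirm the column order matches the indices used above
print(X[:3])            # first three feature rows
print(y[:3])            # their corresponding labels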

Splitting the dataset into the Training set and Test set

from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.25, random_state = 0)
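
With test_size = 0.25, a quarter of the rows are held out for evaluation, and random_state = 0 makes the shuffle reproducible. A quick check of the split (the exact counts depend on the size of the CSV):

print(X_train.shape, X_test.shape)   # roughly a 75% / 25% split of the rows
print(y_train.shape, y_test.shape)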

Applying feature scaling

from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
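
StandardScaler standardises each feature using the mean and standard deviation learned from the training set, and then applies those same statistics to the test set so that no information leaks from test to train. The learned statistics can be inspected after fitting:

print(sc.mean_)    # per-feature means estimated from the training set
print(sc.scale_)   # per-feature standard deviations
# Every value in X_train is now (original - mean) / std, i.e. a z-score;
# X_test was transformed with the same training-set statistics.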

Fitting Logistic Regression to the Training set using the LogisticRegression class

from sklearn.linear_model import LogisticRegression
classifier = LogisticRegression(random_state = 0)
classifier.fit(X_train, y_train)
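
Once fitted, the classifier exposes the weights of its linear decision boundary and can score new points. A small illustrative check (the example age and salary below are hypothetical):

print(classifier.coef_, classifier.intercept_)   # weights and bias of the linear boundary

sample = sc.transform([[30, 87000]])             # hypothetical user: age 30, salary 87,000 (must be scaled first)
print(classifier.predict_proba(sample))          # [P(class 0), P(class 1)]
print(classifier.predict(sample))                # predicted class label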

Predicting the Test set results

y_pred = classifier.predict(X_test)

Making the Confusion Matrix using the confusion_matrix function

from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
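
With scikit-learn's convention, the rows of cm are the true classes and the columns are the predicted classes, so the diagonal entries cm[0, 0] and cm[1, 1] count correct predictions and the off-diagonal entries count mistakes. A quick way to turn the matrix into an accuracy figure:

from sklearn.metrics import accuracy_score

print(cm)                                 # rows: true class, columns: predicted class
print(accuracy_score(y_test, y_pred))     # equals (cm[0, 0] + cm[1, 1]) / cm.sum()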

Visualising the Training set results

from matplotlib.colors import ListedColormap
X_set, y_set = X_train, y_train
# Build a fine grid covering the (scaled) feature space
X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
                     np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
# Colour every grid point by its predicted class to show the decision regions
plt.contourf(X1, X2, classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha = 0.75, cmap = ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
# Overlay the actual training points, coloured by their true class
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c = ListedColormap(('red', 'green'))(i), label = j)
plt.title('Logistic Regression (Training set)')
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()
plt.show()

Output: a plot of the training points over the two predicted regions (red and green), separated by the straight decision boundary learned by the model.

Visualising the Test set results

from matplotlib.colors import ListedColormap
X_set, y_set = X_test, y_test
X1, X2 = np.meshgrid(np.arange(start = X_set[:, 0].min() - 1, stop = X_set[:, 0].max() + 1, step = 0.01),
                     np.arange(start = X_set[:, 1].min() - 1, stop = X_set[:, 1].max() + 1, step = 0.01))
plt.contourf(X1, X2, classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha = 0.75, cmap = ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c = ListedColormap(('red', 'green'))(i), label = j)
plt.title('Logistic Regression (Test set)')
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()
plt.show()

Output: the same decision boundary applied to the test points, showing how well the linear separation generalises to unseen data.

You can access all of the code, along with the dataset, here:

https://github.com/anoopsingh1996/Machine_Learning_Hands_On_Python
