How to Calculate a Confusion Matrix Manually

Sadman Kabir Soumik · Analytics Vidhya · Jul 1, 2020

To understand the terminology properly, I will use a simple binary classification problem. Let's say our dataset contains the product reviews of an e-commerce website. Each review has a label, either positive (1) or negative (0), and our task is to classify whether a review is positive or negative. Let's assume that, using different NLP techniques, we have built a model (good or bad) that can somehow predict these labels. For example, the CSV snapshot below shows a sample of the actual and predicted labels after our model made its predictions.

fig 1: our sample product review predictions against actual true labels

In this dataset, 0 means it's a negative review and 1 means it's a positive review. The predicted labels come from a machine learning model; I won't explain the model or the training/testing phase in this article.

If you count the values in the true_label column, you will find 10 positive reviews (1) and 10 negative reviews (0). The next column (predicted_label) holds the predictions made by our machine learning model. Those values are not exactly the same as the actual labels (true_label).

You may have already seen elsewhere that we calculate the confusion matrix using four values:
TP (True Positive)
TN (True Negative)
FP (False Positive)
FN (False Negative)

Now, what are these values?

True Positive (TP)

When the actual label (true_label) is positive (1) and your machine learning model also predicts that label as positive (1).
In our CSV snapshot, the actual label of sample review 4 was positive (1), and our model also predicted it as positive (1). So, it's a TP value.

True Negative (TN)

When the actual label (true_label) is negative (0) and your machine learning model also predicts that label as negative (0).
In our CSV snapshot, the actual label of sample review 2 was negative (0), and our model also predicted it as negative (0). So, it's a TN value.

False Positive (FP)

When the actual label (true_label) is negative (0), but your machine learning model predicts that label as positive (1).
In our CSV snapshot, the actual label of sample review 1 was negative (0), but our model predicted it as positive (1). So, it's an FP value.

False Negative (FN)

When the actual label (true_label) is positive (1), but your machine learning model predicts that label as negative (0).
In our CSV snapshot, the actual label of sample review 3 was positive (1), but our model predicted it as negative (0). So, it's an FN value.

Calculate the Confusion Matrix

Now you know what each of these values means!

fig 2: TP, TN, FP, FN values of our model prediction

Once you understand this, the rest is just simple math.

In our case (counting from fig 2):
Total TP: 7
Total TN: 6
Total FP: 4
Total FN: 3
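
The article's CSV is shown only as an image, so here is a minimal Python sketch with hypothetical label lists, constructed to match the counts above (10 positives, 10 negatives; reviews 1 to 4 giving the FP, TN, FN, and TP examples we discussed):

```python
# Hypothetical labels: made up to match the article's totals exactly
# (TP = 7, TN = 6, FP = 4, FN = 3), not the real CSV contents.
true_label      = [0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
predicted_label = [1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1]

# Compare the two columns row by row and count each outcome.
TP = sum(1 for t, p in zip(true_label, predicted_label) if t == 1 and p == 1)
TN = sum(1 for t, p in zip(true_label, predicted_label) if t == 0 and p == 0)
FP = sum(1 for t, p in zip(true_label, predicted_label) if t == 0 and p == 1)
FN = sum(1 for t, p in zip(true_label, predicted_label) if t == 1 and p == 0)

print(TP, TN, FP, FN)  # 7 6 4 3
```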

Calculate Accuracy

The formula for calculating the accuracy of your model:

Accuracy = (TP + TN) / (TP + TN + FP + FN)

Plug these values into the formula above and do the simple math: (7 + 6) / 20 = 0.65, which means the accuracy is 65%.
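
Continuing the sketch above, the same calculation in Python:

```python
accuracy = (TP + TN) / (TP + TN + FP + FN)
print(accuracy)  # 0.65
```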

Calculate Precision

The formula for calculating the precision of your model:

Precision = TP / (TP + FP)

Plug in the values and do the simple math: 7 / 11 ≈ 0.636.
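
In code, continuing the same sketch:

```python
precision = TP / (TP + FP)
print(round(precision, 3))  # 0.636
```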

Calculate Recall | Sensitivity | True Positive Rate — TPR

Recall = TP / (TP + FN)

Plug in the values and do the simple math: 7 / 10 = 0.70.
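
In code:

```python
recall = TP / (TP + FN)
print(recall)  # 0.7
```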

Calculate the F1 Score

F1 = 2 × (Precision × Recall) / (Precision + Recall)

Plug in the values and do the simple math: 2 × (0.636 × 0.70) / (0.636 + 0.70) ≈ 0.667.
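
In code, reusing the precision and recall computed above:

```python
f1 = 2 * (precision * recall) / (precision + recall)
print(round(f1, 3))  # 0.667
```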

Calculate False Positive Rate — FPR

FPR = FP / (FP + TN)

Plug in the values and do the simple math: 4 / 10 = 0.4.
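
In code:

```python
fpr = FP / (FP + TN)
print(fpr)  # 0.4
```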

ROC Curve

It is a graph generated by plotting the False Positive Rate (FPR) on the x-axis and the True Positive Rate (TPR) on the y-axis, at various classification thresholds.
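
Note that our model above only produced hard 0/1 labels, which gives a single (FPR, TPR) point; a full ROC curve needs probability scores so the classification threshold can be swept. A minimal sketch with scikit-learn, assuming hypothetical scores:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve

# Hypothetical true labels and probability scores (not from the article's CSV).
y_true  = [0, 0, 1, 1, 1, 0, 1, 0, 1, 1]
y_score = [0.20, 0.60, 0.40, 0.90, 0.70, 0.10, 0.80, 0.30, 0.55, 0.95]

# roc_curve sweeps the threshold and returns one (FPR, TPR) pair per step.
fpr, tpr, thresholds = roc_curve(y_true, y_score)

plt.plot(fpr, tpr, marker="o")
plt.xlabel("False Positive Rate (FPR)")
plt.ylabel("True Positive Rate (TPR)")
plt.title("ROC curve")
plt.show()
```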

Author: Sadman Kabir Soumik
LinkedIn: https://www.linkedin.com/in/sksoumik
