Confusion Matrix

Awab Idris
2 min readJul 10, 2018


A confusion matrix is a table used to evaluate the performance of a classification model when the actual values of the test data are known. For a binary classifier it has two rows and two columns, reporting the counts of true positives, false positives, false negatives, and true negatives.

Confusion Matrix

Let’s say we have a classification model that predicts whether or not someone has a certain virus. A confusion matrix is very helpful in this situation for assessing the performance of our model. The actual value is the result of the actual blood test, positive or negative. The predicted value is what our model predicted, positive or negative.

The terminology used in the confusion matrix:

True Positives (TP): When we predict that someone is positive and the actual result from the blood test is positive.

False Positives (FP): When we predict that someone is positive and the actual result from the blood test is negative.

False Negatives (FN): When we predict that someone is negative and the actual result from the blood test is positive.

True Negatives (TN): When we predict that someone is negative and the actual result from the blood test is negative.
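As a minimal sketch of these four definitions, we can count each cell directly from a pair of label lists (the labels below are hypothetical, with 1 meaning positive and 0 meaning negative):

```python
# Hypothetical test results: 1 = positive, 0 = negative.
actual    = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
predicted = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1]

# Count each cell of the 2x2 confusion matrix.
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)

print(tp, fp, fn, tn)  # the four counts sum to len(actual)
```

Note that TP + FP + FN + TN always equals the total number of test samples, which is a quick sanity check on any confusion matrix you build by hand.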

We can also measure the performance of our model using other metrics derived from the confusion matrix, which provide useful rates:

1- Accuracy (ACC): Percentage of correct predictions, (TP + TN) / total.

2- Misclassification Rate: Percentage of incorrect predictions, (FP + FN) / total.

3- Sensitivity (Recall): Percentage of actual positives predicted correctly, TP / (TP + FN); also called the True Positive Rate.

4- Specificity: Percentage of actual negatives predicted correctly, TN / (TN + FP); also called the True Negative Rate.

5- Precision: Among those I predicted positive, how many did I get correct? TP / (TP + FP); also called the Positive Predictive Value.
