Confusion Matrix is NOT Confusing!

Burcu Koçulu
Published in Analytics Vidhya · 2 min read · Mar 4, 2021


So you want to be a data scientist, and you got an interview. In that interview, they ask you to evaluate this confusion matrix from diabetes data. What would you say?

Let’s go step by step while looking at the GIF below, which I made. Our predictions can be Negative or Positive, while actual values can be True or False.

  1. Replace the predictions (our second value) with labels:
    0 => Negative , 1 => Positive
  2. If the patient is not diabetic and we predicted that the patient is not diabetic (N), we made a TRUE prediction. (TN)
    If the patient is diabetic and we predicted that the patient is diabetic (P), again we made a TRUE prediction. (TP)
  3. Now only the False values are left.
    If the patient is diabetic and we predicted that the patient is not diabetic (N), we made a FALSE prediction. (FN)
    If the patient is not diabetic and we predicted that the patient is diabetic (P), again we made a FALSE prediction. (FP)
True (T) and False (F) refer to the ACTUAL (y) values; Positive (P) and Negative (N) refer to the predictions.
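The rules in steps 2 and 3 can be sketched as a small function. This is only an illustration of the labeling logic above; the function name and the 0/1 encoding (1 = diabetic = Positive) are my own choices, not from the article.

```python
def classify(actual, predicted):
    """Label one (actual, predicted) pair as TP, TN, FP, or FN.

    Encoding assumed here: 1 = diabetic (Positive), 0 = not diabetic (Negative).
    """
    if predicted == 1:
        # Predicted Positive: TRUE if the patient really is diabetic, else FALSE.
        return "TP" if actual == 1 else "FP"
    # Predicted Negative: TRUE if the patient really is not diabetic, else FALSE.
    return "TN" if actual == 0 else "FN"

print(classify(0, 0))  # TN
print(classify(1, 1))  # TP
print(classify(1, 0))  # FN
print(classify(0, 1))  # FP
```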

Now let’s check out our NOT confusing confusion matrix again!

True predictions : 77 TN, 36 TP
False predictions: 14 FN, 27 FP
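As a minimal sketch of how these four counts sit in a matrix, here is the layout scikit-learn's `confusion_matrix` uses (rows = actual, columns = predicted), filled in with the numbers above. The row/column arrangement is an assumption about the matrix shown in the article.

```python
import numpy as np

# Confusion matrix with the article's counts:
# rows = actual (0 = not diabetic, 1 = diabetic), cols = predicted.
cm = np.array([[77, 27],   # actual 0: TN, FP
               [14, 36]])  # actual 1: FN, TP

# Flattening row by row yields the four counts in TN, FP, FN, TP order.
tn, fp, fn, tp = cm.ravel()
print(tn, fp, fn, tp)  # 77 27 14 36
```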

Calculating evaluation metrics:

Accuracy: (77 + 36) / (77 + 36 + 14 + 27) = 113 / 154 ≈ 0.7338

Precision: 36 / (36 + 27) ≈ 0.5714

Recall: 36 / (36 + 14) = 0.72

F1 Score: 2 × (0.5714 × 0.72) / (0.5714 + 0.72) ≈ 0.64
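The arithmetic above can be reproduced directly from the four counts; a minimal sketch:

```python
# Counts from the article's confusion matrix.
tn, fp, fn, tp = 77, 27, 14, 36

accuracy = (tp + tn) / (tp + tn + fp + fn)          # 113 / 154 ≈ 0.7338
precision = tp / (tp + fp)                          # 36 / 63  ≈ 0.5714
recall = tp / (tp + fn)                             # 36 / 50  = 0.72
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean ≈ 0.64

print(round(accuracy, 4), round(precision, 4), round(recall, 2), round(f1, 2))
```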

That’s all!

Thanks for reading!
Please don’t forget to comment if you have any feedback. :)
