# Confusion and Cost Matrices: Measuring Accuracy, Cost, and Other Metrics in Classification Problems

a. True +ve (true positive): the person actually has cancer (Actual class = Yes) and we correctly predict Yes (Predicted class = Yes).

b. False -ve (false negative): the person actually has cancer (Actual class = Yes) but we wrongly predict No (Predicted class = No).

c. False +ve (false positive): the person does not have cancer (Actual class = No) but we wrongly predict Yes (Predicted class = Yes).

d. True -ve (true negative): the person does not have cancer (Actual class = No) and we correctly predict No (Predicted class = No).
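The four cells above can be counted directly from a list of actual and predicted labels. A minimal sketch in plain Python, using a small made-up label set for illustration:

```python
# Counting the four confusion-matrix cells for a binary cancer test.
# "Yes" = has cancer (positive class), "No" = healthy (negative class).
# The labels below are hypothetical example data.
actual    = ["Yes", "Yes", "No", "No",  "Yes", "No", "No", "Yes"]
predicted = ["Yes", "No",  "No", "Yes", "Yes", "No", "No", "No"]

tp = sum(1 for x, p in zip(actual, predicted) if x == "Yes" and p == "Yes")  # (a)
fn = sum(1 for x, p in zip(actual, predicted) if x == "Yes" and p == "No")   # (b)
fp = sum(1 for x, p in zip(actual, predicted) if x == "No"  and p == "Yes")  # (c)
tn = sum(1 for x, p in zip(actual, predicted) if x == "No"  and p == "No")   # (d)

print(tp, fn, fp, tn)  # 2 2 1 3
```

Libraries such as scikit-learn provide `confusion_matrix` to do the same counting, but the hand-rolled version makes the four definitions explicit.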

The confusion matrix is used for evaluating the performance of a classification model.

We can calculate accuracy from the confusion matrix as follows:

We want the counts of 'a' and 'd' (the correct predictions) to be as high as possible, and the counts of 'b' and 'c' (the wrong predictions) to be as low as possible; that is what it means for the model's accuracy to be high.

Accuracy = Total number of correct predictions / Total number of observations

Total number of correct predictions = a + d

Total number of observations = a + b + c + d

Accuracy can be expressed either as a percentage (0% to 100%) or as a fraction (0 to 1). As a fraction, the closer it is to 1, the better the model.
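The formula above is a one-liner in code. A minimal sketch, with hypothetical counts for the four cells:

```python
# Accuracy from the four confusion-matrix counts.
# a = TP, b = FN, c = FP, d = TN -- hypothetical example values.
a, b, c, d = 50, 5, 10, 35

accuracy = (a + d) / (a + b + c + d)
print(accuracy)            # 0.85  (fraction form)
print(f"{accuracy:.0%}")   # 85%   (percentage form)
```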

However, accuracy has some limitations:

Limitations of Accuracy

Let us consider a 2-class problem (0 and 1)

Suppose we have the following scenario:

Number of Class 0 examples = 9990

Number of Class 1 examples = 10

If the model predicts everything to be class 0, then

Accuracy = (9990 + 0) / 10000 = 99.9%

Accuracy is misleading here: we get 99.9%, yet the model never detects a single class 1 example, so accuracy alone cannot tell us how well the model performs.

Looking closely, the class distribution is highly skewed: 9990 class 0 examples versus only 10 class 1 examples in this 2-class problem.

Hence, in the above case accuracy is not the right measure to evaluate the performance of the model.

In such cases we need to check other measures, such as precision, recall, or the ROC curve.
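The imbalanced scenario above can be worked through numerically. A short sketch of the "predict everything as class 0" model:

```python
# Why accuracy misleads on imbalanced data: a model that predicts
# class 0 for every example scores 99.9% accuracy but finds no class 1.
n_class0, n_class1 = 9990, 10

correct = n_class0 + 0          # all class 0 right, all class 1 missed
total = n_class0 + n_class1
accuracy = correct / total
recall_class1 = 0 / n_class1    # no class 1 example is ever detected

print(accuracy)        # 0.999
print(recall_class1)   # 0.0
```

The accuracy looks excellent while the class 1 recall is zero, which is exactly the failure mode described above.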

The cost matrix is similar to the confusion matrix, except that each cell holds the cost of a right or wrong prediction rather than a count.

Example: take a model used to predict whether or not someone has cancer, and consider two kinds of wrong prediction:

Case (1): someone who has cancer is predicted as not having cancer. Case (2): someone who does not have cancer is predicted as having cancer.

So what do you think the cost of these wrong predictions will be?

Obviously, both cases carry a cost.

Case (1) is costlier than Case (2): a patient who has cancer but is predicted as healthy will not receive cancer treatment, so the probability that he or she dies is higher.

In Case (2), a patient who does not have cancer is predicted as having it; he or she will be given cancer treatment, but will eventually be discharged once we discover there is no cancer.

So both cases carry a cost, but Case (1) is costlier than Case (2), because the Case (1) patient has a high probability of dying.

So not all wrong predictions have the same cost; each kind of error must be weighted accordingly.

Suppose there are two models, M1 and M2, each with its own correct and wrong predictions.

Comparing the two, Model M2 has higher accuracy than Model M1, but Model M2 also has a higher cost than Model M1.

So it depends on what kind of problem statement we are facing.

If we are focusing on accuracy, we go with Model M2 (compromising on cost); if we are focusing on cost, we go with Model M1 (compromising on accuracy).
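The M1-versus-M2 trade-off can be sketched with a cost matrix. All numbers below are hypothetical: the counts for M1 and M2 and the per-error costs are assumptions chosen so that, as in the text, M2 wins on accuracy while M1 wins on cost (a missed cancer, FN, is weighted far more heavily than a false alarm, FP):

```python
# Hypothetical cost matrix: correct predictions cost nothing,
# a false negative (missed cancer) costs far more than a false positive.
COST = {"TP": 0, "TN": 0, "FP": 1, "FN": 100}

def accuracy(counts):
    return (counts["TP"] + counts["TN"]) / sum(counts.values())

def total_cost(counts):
    return sum(counts[cell] * COST[cell] for cell in counts)

m1 = {"TP": 90, "FN": 10, "FP": 60, "TN": 840}   # fewer misses, more alarms
m2 = {"TP": 70, "FN": 30, "FP": 10, "TN": 890}   # higher accuracy, more misses

print(accuracy(m1), total_cost(m1))  # 0.93 1060
print(accuracy(m2), total_cost(m2))  # 0.96 3010
```

With these assumed numbers M2 is more accurate (0.96 vs. 0.93) yet costs almost three times as much, so the "better" model depends on which objective the problem statement prioritizes.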

Other Cost Sensitive Measures:

Precision (p) = a / (a + c)

Recall (r) = a / (a + b)

F-measure (F) = 2a / (2a + b + c)

More measures:

True positive rate (TPR) (Sensitivity)

TPR = a / (a + b)

True negative rate (TNR) (Specificity)

TNR = d / (c + d)

False positive rate (FPR)

FPR = c / (c + d)

False negative rate (FNR)

FNR = b / (a + b)
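All of the measures above follow directly from the four cells. A minimal sketch, using hypothetical counts:

```python
# Precision, recall, F-measure, and the four rates, computed from
# the confusion-matrix cells a (TP), b (FN), c (FP), d (TN).
# The counts are hypothetical example values.
a, b, c, d = 80, 20, 30, 870

precision = a / (a + c)
recall    = a / (a + b)            # same as TPR / sensitivity
f_measure = 2 * a / (2 * a + b + c)

tpr = a / (a + b)                  # sensitivity
tnr = d / (c + d)                  # specificity
fpr = c / (c + d)                  # equals 1 - TNR
fnr = b / (a + b)                  # equals 1 - TPR

print(round(precision, 3), round(recall, 3), round(f_measure, 3))
```

Note that FPR = 1 - TNR and FNR = 1 - TPR, so only two of the four rates are independent.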
