Understanding the Confusion Matrix
A confusion matrix (CM) describes the performance of a classification model.
Let's start with an example confusion matrix for a binary classifier.
Suppose we have 100 Apples and 50 Oranges, so 150 items in total.
Let's now define the four basic terms, treating Apple as the positive class:
- true positives (TP): we predicted Apple, and the item really is an Apple.
- true negatives (TN): we predicted Orange, and the item really is an Orange.
- false positives (FP): we predicted Apple, but the item is actually an Orange. (Also known as a "Type I error.")
- false negatives (FN): we predicted Orange, but the item is actually an Apple. (Also known as a "Type II error.")
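As a sketch, these four counts can be tallied directly from paired lists of actual and predicted labels. The label lists below are made-up examples, not data from this article:

```python
# Treat "Apple" as the positive class and tally the four outcomes.
# y_true / y_pred are small made-up example lists.
y_true = ["Apple", "Apple", "Orange", "Orange", "Apple"]
y_pred = ["Apple", "Orange", "Orange", "Apple", "Apple"]

pairs = list(zip(y_true, y_pred))
tp = sum(t == "Apple" and p == "Apple" for t, p in pairs)    # predicted Apple, is Apple
tn = sum(t == "Orange" and p == "Orange" for t, p in pairs)  # predicted Orange, is Orange
fp = sum(t == "Orange" and p == "Apple" for t, p in pairs)   # predicted Apple, is Orange
fn = sum(t == "Apple" and p == "Orange" for t, p in pairs)   # predicted Orange, is Apple

print(tp, tn, fp, fn)  # 2 1 1 1
```

Note that TP + TN + FP + FN always equals the total number of items, since every item falls into exactly one of the four cells.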
Case 1
1. Out of the 100 Apples:
Number predicted as Apple: 100
Number predicted as Orange: 0
2. Similarly, out of the 50 Oranges:
Number predicted as Orange: 50
Number predicted as Apple: 0
So the confusion matrix looks like this (rows are actual classes, columns are predicted classes):

                 Predicted: Apple   Predicted: Orange
Actual: Apple         100 (TP)            0 (FN)
Actual: Orange          0 (FP)           50 (TN)
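The Case 1 counts can be sketched in code. This builds and prints the matrix for a perfect classifier, using the rows-are-actual, columns-are-predicted layout (a common convention):

```python
# Case 1 counts taken from the text: every item is classified correctly.
tp, fn = 100, 0   # actual Apples:  100 predicted Apple, 0 predicted Orange
fp, tn = 0, 50    # actual Oranges: 0 predicted Apple, 50 predicted Orange

matrix = [[tp, fn],   # row for actual Apple
          [fp, tn]]   # row for actual Orange

print("                Pred:Apple  Pred:Orange")
for name, row in zip(["Actual:Apple ", "Actual:Orange"], matrix):
    print(f"{name}  {row[0]:>10}  {row[1]:>11}")
```

Off-diagonal cells are zero here, which is what a perfect classifier looks like: all 150 items sit on the main diagonal.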
Case 2
1. Out of the 100 Apples:
Number correctly predicted as Apple: 75
Number mistakenly predicted as Orange: 25
2. Similarly, out of the 50 Oranges:
Number correctly predicted as Orange: 49
Number mistakenly predicted as Apple: 1
So the confusion matrix looks like this:

                 Predicted: Apple   Predicted: Orange
Actual: Apple          75 (TP)           25 (FN)
Actual: Orange          1 (FP)           49 (TN)
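A quick sanity check of the Case 2 numbers: the row sums of the matrix must recover the original class counts, and all four cells together must add up to 150 items.

```python
# Case 2 counts taken from the text.
tp, fn = 75, 25   # actual Apples:  75 predicted Apple, 25 predicted Orange
fp, tn = 1, 49    # actual Oranges: 1 predicted Apple, 49 predicted Orange

print(tp + fn)            # 100 actual Apples
print(fp + tn)            # 50 actual Oranges
print(tp + fn + fp + tn)  # 150 items in total
print(tp + fp)            # 76 items predicted as Apple
```

This kind of check is useful in practice: if the row sums do not match your known class counts, the matrix was filled in with the wrong orientation.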