                Predicted class
Actual class    Cat   Dog   Rabbit
Cat              5     3      0
Dog              2     3      1
Rabbit           0     2     11
How can I calculate precision and recall so that it becomes easy to calculate the F1-score? The usual confusion matrix is 2 x 2. However, when it becomes 3 x 3, I don't know how to calculate precision and recall.
Answer
If you spell out the definitions of precision (a.k.a. positive predictive value, PPV) and recall (a.k.a. sensitivity), you see that they relate to one class independently of any other classes:
Recall or sensitivity is the proportion of cases correctly identified as belonging to class c among all cases that truly belong to class c.
(Given a case truly belonging to class "c", what is the probability of predicting this correctly?)
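As a formula (my own shorthand, not from the original answer): writing $M$ for the confusion matrix with actual classes in rows and predicted classes in columns,

$$\mathrm{recall}_c = \frac{TP_c}{TP_c + FN_c} = \frac{M_{cc}}{\sum_j M_{cj}},$$

i.e. the diagonal element divided by its row sum.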
Precision or positive predictive value PPV is the proportion of cases correctly identified as belonging to class c among all cases that the classifier claims belong to class c.
In other words, of those cases predicted to belong to class c, which fraction truly belongs to class c? (Given the prediction "c", what is the probability of being correct?)
Negative predictive value NPV: of those cases predicted not to belong to class c, which fraction truly doesn't belong to class c? (Given the prediction "not c", what is the probability of being correct?)
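In the same notation (again my shorthand, assuming rows are actual and columns predicted classes):

$$\mathrm{precision}_c = \frac{TP_c}{TP_c + FP_c} = \frac{M_{cc}}{\sum_i M_{ic}},$$

the diagonal element divided by its column sum, and

$$\mathrm{NPV}_c = \frac{TN_c}{TN_c + FN_c}.$$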
So you can calculate precision and recall separately for each of your classes. For a multi-class confusion matrix with actual classes in rows and predicted classes in columns (as in the question), recall for class c is the diagonal element divided by its row sum, and precision is the diagonal element divided by its column sum.
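Here is a minimal NumPy sketch of that calculation for the matrix from the question (the variable names are my own, not from the original answer):

```python
import numpy as np

# Confusion matrix from the question: rows = actual class, columns = predicted class
labels = ["Cat", "Dog", "Rabbit"]
cm = np.array([
    [5, 3, 0],    # actual Cat
    [2, 3, 1],    # actual Dog
    [0, 2, 11],   # actual Rabbit
])

tp = np.diag(cm)                 # correctly classified cases per class
recall = tp / cm.sum(axis=1)     # diagonal / row sum (all cases truly in class c)
precision = tp / cm.sum(axis=0)  # diagonal / column sum (all cases predicted as c)
f1 = 2 * precision * recall / (precision + recall)

for name, p, r, f in zip(labels, precision, recall, f1):
    print(f"{name}: precision={p:.3f}  recall={r:.3f}  F1={f:.3f}")
```

For the Cat class, for example, this gives precision = 5/7 ≈ 0.714, recall = 5/8 = 0.625, and F1 ≈ 0.667.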
Attribution
Source: Link, Question Author: user22149, Answer Author: cbeleites unhappy with SX