Page 218 - AI Computer 10
Precision = True Positives / All Predicted Positives

Precision = TP / (TP + FP)

where, Total positive predictions = TP + FP
Precision is an important evaluation metric; a high precision indicates that the model produces more True Positives than False Positives.
u Recall: Like Precision, Recall is another parameter used for evaluating a model’s performance. It is defined as the ratio of the number of True Positives to the number of all actual positive cases. Thus, its formula can be written as:

Recall = True Positives / All Actual Positive Cases

Recall = TP / (TP + FN)

where, Total actual positive cases = TP + FN
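The two formulas above can be sketched as short Python functions. The counts used in the example call (40 True Positives, 10 False Positives, 10 False Negatives) are hypothetical values chosen only for illustration.

```python
def precision(tp, fp):
    # Precision = TP / (TP + FP): the share of predicted positives that are correct
    return tp / (tp + fp)

def recall(tp, fn):
    # Recall = TP / (TP + FN): the share of actual positives that the model found
    return tp / (tp + fn)

# Hypothetical counts: 40 TP, 10 FP, 10 FN
print(precision(40, 10))  # 0.8
print(recall(40, 10))     # 0.8
```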
Knowledge Bot
An AI model’s performance can be fully evaluated by determining both measures i.e., Precision and
Recall.
u F1 Score: F1 Score can be defined as the harmonic mean of Precision and Recall.

The formula for F1 Score is:

F1 Score = (2 × Precision × Recall) / (Precision + Recall)

The best or perfect value for an F1 Score is one (1) and the worst value is zero (0).
You should remember the following points about Precision, Recall, and F1 score.
u If Precision is low and Recall is low, then F1 score is low.
u If Precision is low and Recall is high, then F1 score is low.
u If Precision is high and Recall is low, then F1 score is low.
u If Precision is high and Recall is high, then F1 score is high.
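The four cases above can be verified with a small Python sketch of the harmonic-mean formula; the Precision and Recall values passed in are illustrative, not from any real model.

```python
def f1_score(precision, recall):
    # Harmonic mean of Precision and Recall: 2PR / (P + R)
    return 2 * precision * recall / (precision + recall)

# Illustrative values for the four cases listed above
print(round(f1_score(0.1, 0.1), 2))  # low P,  low R  -> low F1 (0.1)
print(round(f1_score(0.1, 0.9), 2))  # low P,  high R -> low F1 (0.18)
print(round(f1_score(0.9, 0.1), 2))  # high P, low R  -> low F1 (0.18)
print(round(f1_score(0.9, 0.9), 2))  # high P, high R -> high F1 (0.9)
```

Note that the harmonic mean is pulled toward the smaller of the two values, which is why a single low measure drags the whole F1 Score down.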
Let us solve some problems involving calculation of the various evaluation metrics.
Example 1: Schools often face a shortage of drinking water, and in some places such shortages are common and severe. Hence, an AI model is designed to predict whether or not there is going to be a water shortage in the school in the near future. The confusion matrix for this model is:
The Confusion Matrix    Reality: 1    Reality: 0
Predicted: 1            22            12
Predicted: 0            47            18
Calculate Accuracy, Precision, Recall, and F1 Score for the above problem.
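You can check your working with a short Python sketch that reads the four counts straight from the confusion matrix above (TP = 22, FP = 12, FN = 47, TN = 18) and applies the formulas from this section:

```python
# Counts from the confusion matrix in Example 1:
# TP = 22 (Predicted 1, Reality 1), FP = 12 (Predicted 1, Reality 0),
# FN = 47 (Predicted 0, Reality 1), TN = 18 (Predicted 0, Reality 0).
tp, fp, fn, tn = 22, 12, 47, 18

accuracy = (tp + tn) / (tp + tn + fp + fn)            # correct predictions / all cases
precision = tp / (tp + fp)                            # TP / all predicted positives
recall = tp / (tp + fn)                               # TP / all actual positives
f1_score = 2 * precision * recall / (precision + recall)

print(round(accuracy, 3))   # 0.404
print(round(precision, 3))  # 0.647
print(round(recall, 3))     # 0.319
print(round(f1_score, 3))   # 0.427
```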
84