
Here, the negative prediction of no flood made by the model does not match the reality of a flood. This is called a False Negative (FN).
All four values – True Positive, True Negative, False Positive, and False Negative – are instrumental in calculating the various metrics involved in the evaluation of an AI model.
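By way of illustration, here is a minimal Python sketch (using made-up flood values, not data from the text) that counts the four outcomes by comparing each prediction against reality:

```python
# Hypothetical flood data: 1 = flood, 0 = no flood (illustrative values only)
prediction = [1, 0, 1, 0, 0, 1, 0, 0]
reality    = [1, 0, 0, 0, 1, 1, 0, 1]

tp = tn = fp = fn = 0
for pred, real in zip(prediction, reality):
    if pred == 1 and real == 1:
        tp += 1    # True Positive: predicted flood, and a flood occurred
    elif pred == 0 and real == 0:
        tn += 1    # True Negative: predicted no flood, and no flood occurred
    elif pred == 1 and real == 0:
        fp += 1    # False Positive: predicted flood, but no flood occurred
    else:
        fn += 1    # False Negative: predicted no flood, but a flood occurred

print("TP:", tp, "TN:", tn, "FP:", fp, "FN:", fn)
```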

            Confusion Matrix

In simple words, a confusion matrix is a summary of the prediction results of an AI model. A Confusion Matrix is an N × N matrix used for evaluating the performance of the model on the basis of two parameters: Prediction and Reality. It is useful because it allows a direct comparison of values such as True Positive, False Positive, True Negative, and False Negative. Let us draw the confusion matrix for the above examples of flood prediction.

[Figure: Confusion Matrix for the flood prediction example, with Prediction (Positive/Negative) along the rows and Reality (Positive/Negative) along the columns]

            In the Confusion Matrix:

 • the Prediction is plotted in the rows and has two values – Positive (Yes or 1) and Negative (No or 0).
 • the Reality is plotted in the columns and has two values – Positive (Yes or 1) and Negative (No or 0).
 • each cell in the matrix contains the number of cases or data values that satisfy the conditions of both the row and column headings.
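As a minimal sketch (again with made-up values rather than data from the text), the same comparison can be tabulated into a 2 × 2 matrix with Prediction along the rows and Reality along the columns:

```python
# Hypothetical flood data: 1 = flood, 0 = no flood (illustrative values only)
prediction = [1, 0, 1, 0, 0, 1, 0, 0]
reality    = [1, 0, 0, 0, 1, 1, 0, 1]

# matrix[row][column]: rows = Prediction (Positive, Negative),
#                      columns = Reality (Positive, Negative)
matrix = [[0, 0],   # Prediction = Positive: [TP, FP]
          [0, 0]]   # Prediction = Negative: [FN, TN]

for pred, real in zip(prediction, reality):
    row = 0 if pred == 1 else 1    # first row holds positive predictions
    col = 0 if real == 1 else 1    # first column holds positive reality
    matrix[row][col] += 1

print("                 Reality:Yes  Reality:No")
print(f"Prediction:Yes   {matrix[0][0]:>11}  {matrix[0][1]:>10}")
print(f"Prediction:No    {matrix[1][0]:>11}  {matrix[1][1]:>10}")
```

Note that ready-made functions such as scikit-learn's confusion_matrix place the actual (Reality) values along the rows by default, which is the opposite orientation to the one used here, so the axes should always be checked before reading off the cells.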

              Knowledge Bot
              A Confusion Matrix is also known as the Error Matrix. It is used in situations where we need to evaluate the performance of the model in terms of where it went wrong, helping us find ways to increase the efficiency of the model.

            Terminologies of Confusion Matrix

To understand the confusion matrix, let's consider a scenario that involves whether the Indian Cricket team wins the World Cup or not.
 • Positive: The prediction is positive for the scenario. For example, you predict that the team wins the match.
 • Negative: The prediction is negative for the scenario. For example, you predict that the team does not win the match.
 • True Positive (TP): The predicted positive value matches the actual positive value. For example, you predict that the team wins the match and it actually wins the match.

 • True Negative (TN): The predicted negative value matches the actual negative value. For example, you predict that the team does not win the match and it actually does not win the match.
 • False Positive (FP): The predicted positive value does not match the actual value, which is negative. For example, you predict that the team wins the match, but it does not win the match. This is also known as a Type 1 Error.


