Measure

confusion_matrix

The confusion matrix, also called a contingency table, summarizes the number of instances predicted to belong to each class versus their actual class. It is an NxN matrix, where N is the number of different class values, with the predicted classes in the columns and the actual classes in the rows. For 2 class values (positive and negative), the fields of the matrix are, from left to right and top to bottom: the number of true positives (TP), false negatives (FN), false positives (FP) and true negatives (TN). The number of correctly classified instances is the sum of the diagonal entries of the matrix; all other entries count misclassifications (e.g. class "a" misclassified as "b"). See: http://en.wikipedia.org/wiki/Confusion_matrix

Each value of the confusion matrix is labeled with its actual and predicted class, e.g. 'actual=pos, predicted=neg'.
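As a minimal sketch of the layout described above (rows are actual classes, columns are predicted classes), the matrix can be built with a simple Python function; this is an illustration only, not the WEKA Evaluation implementation, and the class values and labels used here are examples:

```python
def confusion_matrix(actual, predicted, classes):
    """Build an NxN confusion matrix: rows = actual class, columns = predicted class."""
    index = {c: i for i, c in enumerate(classes)}
    n = len(classes)
    matrix = [[0] * n for _ in range(n)]
    for a, p in zip(actual, predicted):
        matrix[index[a]][index[p]] += 1
    return matrix

# Example with 2 class values, positive listed first so that
# m[0][0] = TP, m[0][1] = FN, m[1][0] = FP, m[1][1] = TN.
actual    = ["pos", "pos", "neg", "neg", "pos"]
predicted = ["pos", "neg", "neg", "pos", "pos"]
m = confusion_matrix(actual, predicted, ["pos", "neg"])

# Correctly classified instances = sum of the diagonal entries.
correct = sum(m[i][i] for i in range(len(m)))
```

For the example data this yields m = [[2, 1], [1, 1]], i.e. 2 true positives, 1 false negative, 1 false positive, 1 true negative, and 3 correctly classified instances.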

Source Code:
See WEKA's Evaluation class


Properties

Minimum value:
Maximum value:
Unit:
Optimization:
Higher is better: