Measure

precision

Precision is defined as the number of true positive (TP) predictions divided by the total number of positive predictions, i.e. the sum of true positives and false positives (TP+FP): $$\text{Precision}=\frac{TP}{TP+FP} \, $$ It is also referred to as the positive predictive value (PPV). See: http://en.wikipedia.org/wiki/Precision_and_recall. Precision is defined only for a specific class value, and should thus be labeled with the class value for which it was computed. Use the mean_weighted_precision for the weighted average over all class values.
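
As a concrete illustration of this formula (not part of the measure definition), the sketch below computes precision for one class directly from a confusion matrix whose rows are actual classes and whose columns are predicted classes, the same convention used by the WEKA code below. The class and method names are hypothetical:

  // Illustrative sketch: precision for one class from a confusion matrix
  // (rows = actual class, columns = predicted class).
  public class PrecisionExample {

    static double precision(double[][] confusion, int classIndex) {
      double tp = confusion[classIndex][classIndex]; // true positives
      double predictedPositive = 0;                  // TP + FP
      for (double[] row : confusion) {
        predictedPositive += row[classIndex];        // sum of column classIndex
      }
      return predictedPositive == 0 ? 0 : tp / predictedPositive;
    }

    public static void main(String[] args) {
      // Two-class example: 8 TP and 2 FP for class 0 -> 8 / (8 + 2) = 0.8
      double[][] confusion = { { 8, 1 },
                               { 2, 9 } };
      System.out.println(precision(confusion, 0)); // prints 0.8
    }
  }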

Source Code:
WEKA's Evaluation.precision(int classIndex)

  /**
   * Calculate the precision with respect to a particular class.
   * This is defined as
   *
   *   correctly classified positives
   *   ------------------------------
   *    total predicted as positive
   *
   * @param classIndex the index of the class to consider as "positive"
   * @return the precision
   */
  public double precision(int classIndex) {
    double correct = 0, total = 0;
    // Column classIndex of the confusion matrix holds everything predicted
    // as the "positive" class; its diagonal entry is the true positive count.
    for (int i = 0; i < m_NumClasses; i++) {
      if (i == classIndex) {
        correct += m_ConfusionMatrix[i][classIndex];
      }
      total += m_ConfusionMatrix[i][classIndex];
    }
    if (total == 0) {
      return 0;
    }
    return correct / total;
  }
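
For reference, a minimal usage sketch of this method through WEKA's public API; the dataset path "iris.arff" is a placeholder for any local ARFF file with a nominal class, and J48 stands in for an arbitrary classifier:

  import java.io.BufferedReader;
  import java.io.FileReader;
  import java.util.Random;

  import weka.classifiers.Evaluation;
  import weka.classifiers.trees.J48;
  import weka.core.Instances;

  public class PrecisionDemo {
    public static void main(String[] args) throws Exception {
      // "iris.arff" is a placeholder path; the last attribute is taken as the class.
      Instances data = new Instances(new BufferedReader(new FileReader("iris.arff")));
      data.setClassIndex(data.numAttributes() - 1);

      // 10-fold cross-validation fills the confusion matrix used by precision().
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(new J48(), data, 10, new Random(1));

      // Precision for class index 0, labeled with its class value.
      System.out.println("precision(" + data.classAttribute().value(0) + ") = "
          + eval.precision(0));
      // Weighted average over all class values (cf. mean_weighted_precision).
      System.out.println("weighted precision = " + eval.weightedPrecision());
    }
  }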

Properties

Minimum value: 0
Maximum value: 1
Unit: none (precision is a dimensionless ratio)
Optimization: Higher is better